# Cauchy distribution & tail estimate

• November 17th 2009, 09:46 AM
akbar
Cauchy distribution & tail estimate
Let $\xi_1$, $\xi_2$... be independent identically distributed random variables with the Cauchy distribution. How do you prove that:

$\liminf_{n\rightarrow\infty}\mathbb{P}(\max(\xi_1, ...,\xi_n)> xn) \geq \exp(-\pi x)$

for any $x\geq 0$ ?

The Cauchy distribution is given by the density:

$f(u)=\frac{1}{\pi (1+u^2)}$ , $u \in\mathbb{R}$

And has no expectation or variance defined.

Thanks for any help.
• November 17th 2009, 02:26 PM
Laurent
Quote:

Originally Posted by akbar
Let $\xi_1$, $\xi_2$... be independent identically distributed random variables with the Cauchy distribution. How do you prove that:

$\liminf_{n\rightarrow\infty}\mathbb{P}(\max(\xi_1, ...,\xi_n)> xn) \geq \exp(-\pi x)$

for any $x\geq 0$ ?

This is not just a liminf: the usual procedure for this kind of problem actually gives the limit. We have:

$P(\xi>T)=\int_T^\infty \frac{dt}{\pi(1+t^2)}=\frac{1}{\pi}\arctan\frac{1}{T}\sim_{T\to\infty}\frac{1}{\pi T}$

Hence, for $x>0$, $P(\max(\xi_1,\ldots,\xi_n)>xn)=1-P(\xi\leq xn)^n=1-(1-P(\xi>xn))^n$ $=1-\exp\left(n\log\left(1-\frac{1}{\pi x n}+o\left(\frac{1}{n}\right)\right)\right)=1-\exp\left(-\frac{1}{\pi x}+o(1)\right)$, which gives:

$P(\max \xi_1,\ldots,\xi_n> xn)\to_n 1-e^{-\frac{1}{\pi x}}$.
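For what it's worth, this limit is easy to sanity-check by simulation. A rough Python sketch using NumPy's standard Cauchy sampler (the choices $n=1000$, $x=0.5$, and 10,000 trials are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials, x = 1_000, 10_000, 0.5

# trials independent rows, each holding n i.i.d. standard Cauchy draws
samples = rng.standard_cauchy((trials, n))

# empirical estimate of P(max(xi_1, ..., xi_n) > x*n)
empirical = (samples.max(axis=1) > x * n).mean()

# limiting value 1 - exp(-1/(pi*x)) from the computation above
limit = 1 - np.exp(-1.0 / (np.pi * x))

print(empirical, limit)  # the two should agree to a couple of decimals
```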

As a matter of fact, it seems that indeed $1-e^{-\frac{1}{\pi x}}\geq e^{-\pi x}$; with $y=\pi x$, this amounts to showing that the function $y\mapsto e^{-y}+e^{-1/y}$ stays below 1 (maybe not easy to prove, but anyway the plot convinces me). I guess the question suggests a more direct way, but since the above limit is stronger... it satisfies me ;)
• November 17th 2009, 02:54 PM
akbar
I concur. This is the result I found as well, but I wasn't entirely sure, since the problem asked for a limit inferior while one can evidently get a full limit. Unfortunately, this book is plagued with errors, so I wouldn't be surprised if this were yet another one.

On the function comparison side, a quick plot in Maxima confirms the inequality. Alternatively, you can compute the derivative of
$f(x)=e^{-x}+e^{-1/x}$
and check that the function tends to 1 at both ends: it starts from 1 as $x\to 0^+$, decreases to a minimum of $2/e$ at $x=1$, then increases back towards 1 as $x\to\infty$. So the function stays strictly below 1, although the plot alone is not a full proof.
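The shape of $f$ can also be checked numerically; a small Python sketch (the grid bounds $10^{-4}$ to $10^4$ are an arbitrary choice):

```python
import math

def f(x):
    return math.exp(-x) + math.exp(-1.0 / x)

# f is symmetric under x -> 1/x, tends to 1 as x -> 0+ and as
# x -> infinity, and has a critical point at x = 1 where f(1) = 2/e.
print(f(1.0))  # 2/e ~ 0.7358

# scan a log-spaced grid to check that f stays below 1 on (0, infinity)
grid = [10 ** (k / 50) for k in range(-200, 201)]  # 1e-4 up to 1e4
print(max(f(x) for x in grid) < 1.0)  # True
```

Incidentally, the elementary bound $e^{-t}\leq\frac{1}{1+t}$ for $t\geq 0$ gives $e^{-x}\leq\frac{1}{1+x}$ and $e^{-1/x}\leq\frac{x}{1+x}$, which sum to exactly 1, so $f\leq 1$ on $(0,\infty)$ does admit a short proof.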

Many Thanks.
• November 17th 2009, 03:58 PM
Laurent
Quote:

Originally Posted by akbar
I concur. This is the result I found as well, but I wasn't entirely sure, since the problem asked for a limit inferior while one can evidently get a full limit. Unfortunately, this book is plagued with errors, so I wouldn't be surprised if this were yet another one.

On the function comparison side, a quick plot in Maxima confirms the inequality. Alternatively, you can compute the derivative of
$f(x)=e^{-x}+e^{-1/x}$
and check that the function tends to 1 at both ends: it starts from 1 as $x\to 0^+$, decreases to a minimum of $2/e$ at $x=1$, then increases back towards 1 as $x\to\infty$. So the function stays strictly below 1, although the plot alone is not a full proof.

Many Thanks.

a) I would really have appreciated your mentioning your findings beforehand. I took some time to write a clean proof, because the discussions in your previous thread raised interesting problems and suggested that you would have mentioned your attempts or doubts. Fortunately this one wasn't that long; this is mostly a note for next time... ;)
b) Rather than a typo, I would guess that the authors have a simple and perhaps clever proof of what they claim, probably based on an equation in the chapter this problem comes from. The fact that the limiting distribution stochastically dominates an exponential distribution may well have a good reason, even though I don't know which one. But, as I wrote, I won't bother with it tonight. Unless the proof is really nice...
• November 17th 2009, 04:17 PM
akbar
a) Let me assure you that I hadn't found anything before posting this thread (frankly, I was quite desperate after a couple of days of tinkering); otherwise I would have mentioned it. It seems the mere act of posting the problem led me to the obvious answer (perhaps at the same time as you?). This forum has undoubted virtues.
In any case, apologies for the trouble.
b) The chapter uses precisely the "usual procedure" to find the limit directly, which makes the stated bound even more odd. But who knows.

Let me take advantage of my 1-hour lag to wish you a good night.
• November 18th 2009, 02:18 PM
Laurent
Thanks for the clarification; it is unfortunately all too frequent that posters simply drop their homework questions on the forum, without further thought, for others to work in their place... I'm glad you're not one of them!

Good night to you too! (Sleepy)