# Chebyshev's inequality

• Jun 7th 2010, 12:56 PM
Veve
Chebyshev's inequality
I guess this is not a difficult problem, but I cannot find the function to which to apply Chebyshev's inequality...

Assume $\displaystyle X$ is a random variable with the property that $\displaystyle \mathbb{E}[X]=0$ and $\displaystyle \mathbb{E}[X^2]=\sigma^2$. Use Chebyshev's inequality for the random variable $\displaystyle Y=(X+c)^2$, $\displaystyle c>0$, to prove that
$\displaystyle \mathbb{P}(X>x)\le\frac{\sigma^2}{x^2+\sigma^2}$ for $\displaystyle x>0$.

Thanks.
• Jun 7th 2010, 01:07 PM
Laurent
Quote:

Originally Posted by Veve
I guess this is not a difficult problem, but I cannot find the function to which to apply Chebyshev's inequality...

Assume $\displaystyle X$ is a random variable with the property that $\displaystyle \mathbb{E}[X]=0$ and $\displaystyle \mathbb{E}[X^2]=\sigma^2$. Use Chebyshev's inequality for the random variable $\displaystyle Y=(X+c)^2$, $\displaystyle c>0$, to prove that
$\displaystyle \mathbb{P}(X>x)\le\frac{\sigma^2}{x^2+\sigma^2}$ for $\displaystyle x>0$.

Thanks.

You have, using Chebyshev's inequality, for $\displaystyle x,c>0$, $\displaystyle P(X>x)=P(X+c>x+c)\leq P((X+c)^2>(x+c)^2)\leq \frac{E[(X+c)^2]}{(x+c)^2}$ (the first inequality holds since $\displaystyle x+c>0$). Now expand the square inside the expectation; the right-hand side is then a function of $\displaystyle c$, while the left-hand side isn't. The "best" (sharpest) inequality is the one where the right-hand side is smallest, so you should choose the $\displaystyle c$ that minimizes the right-hand side. This reduces to a standard function study (derivative, etc.) to find the minimum.
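The reply leaves the expansion and minimization as an exercise; here is one way that computation might go, using only the assumptions $\mathbb{E}[X]=0$ and $\mathbb{E}[X^2]=\sigma^2$ from the problem statement:

```latex
% Expand the square inside the expectation:
\mathbb{E}[(X+c)^2] = \mathbb{E}[X^2] + 2c\,\mathbb{E}[X] + c^2 = \sigma^2 + c^2,
% so the bound becomes
\mathbb{P}(X>x) \le f(c), \qquad f(c) := \frac{\sigma^2 + c^2}{(x+c)^2}.
% Differentiate in c (quotient rule, then factor out 2(x+c)):
f'(c) = \frac{2c(x+c)^2 - 2(x+c)(\sigma^2 + c^2)}{(x+c)^4}
      = \frac{2(cx - \sigma^2)}{(x+c)^3},
% which vanishes at c = \sigma^2/x; f' < 0 before and f' > 0 after,
% so this is the minimum. Substituting c = \sigma^2/x:
f\!\left(\tfrac{\sigma^2}{x}\right)
  = \frac{\sigma^2 + \sigma^4/x^2}{\left(x + \sigma^2/x\right)^2}
  = \frac{\sigma^2\,(x^2+\sigma^2)/x^2}{(x^2+\sigma^2)^2/x^2}
  = \frac{\sigma^2}{x^2+\sigma^2}.
```

With this optimal $c$ the bound is exactly $\displaystyle \mathbb{P}(X>x)\le\frac{\sigma^2}{x^2+\sigma^2}$, which is the claimed inequality (sometimes called the one-sided Chebyshev, or Cantelli, inequality).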