# Math Help - Chebyshev's inequality

1. ## Chebyshev's inequality

I guess this is not a difficult problem, but I cannot find the function to which to apply Chebyshev's inequality...

Assume $X$ is a random variable with the property that $\mathbb{E}[X]=0$ and $\mathbb{E}[X^2]=\sigma^2$. Use Chebyshev's inequality for the random variable $Y=(X+c)^2$, $c>0$, to prove that
$\mathbb{P}(X>x)\le\frac{\sigma^2}{x^2+\sigma^2}$ for $x>0$.

Thanks.

2. Originally Posted by Veve

I guess this is not a difficult problem, but I cannot find the function to which to apply Chebyshev's inequality...

Assume $X$ is a random variable with the property that $\mathbb{E}[X]=0$ and $\mathbb{E}[X^2]=\sigma^2$. Use Chebyshev's inequality for the random variable $Y=(X+c)^2$, $c>0$, to prove that
$\mathbb{P}(X>x)\le\frac{\sigma^2}{x^2+\sigma^2}$ for $x>0$.

Thanks.
Using Chebyshev's inequality (i.e. Markov's inequality applied to $Y=(X+c)^2$), you have, for $x,c>0$,
$$P(X>x)=P(X+c>x+c)\leq P\big((X+c)^2>(x+c)^2\big)\leq \frac{E[(X+c)^2]}{(x+c)^2}.$$
(The first inequality uses the fact that $x+c>0$, so $X+c>x+c$ forces $(X+c)^2>(x+c)^2$.) Now expand the square inside the expectation; the right-hand side then becomes a function of $c$, while the left-hand side does not depend on $c$. The sharpest bound is the one with the smallest right-hand side, so you have to choose the $c$ that minimizes it. This reduces to a standard calculus exercise (take the derivative, set it to zero) to find the minimum.
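For completeness, here is one way the minimization sketched above can be carried out (my own working, using only $\mathbb{E}[X]=0$ and $\mathbb{E}[X^2]=\sigma^2$, so do check the algebra):

```latex
% Expand the expectation using E[X] = 0 and E[X^2] = \sigma^2:
\mathbb{E}[(X+c)^2] = \mathbb{E}[X^2] + 2c\,\mathbb{E}[X] + c^2 = \sigma^2 + c^2.

% The bound to minimize over c > 0 is therefore
f(c) = \frac{\sigma^2 + c^2}{(x+c)^2},
\qquad
f'(c) = \frac{2c(x+c)^2 - 2(x+c)(\sigma^2 + c^2)}{(x+c)^4}
      = \frac{2(cx - \sigma^2)}{(x+c)^3}.

% f'(c) = 0 gives c = \sigma^2 / x; substituting back:
f\!\left(\frac{\sigma^2}{x}\right)
  = \frac{\sigma^2 + \sigma^4/x^2}{\left(x + \sigma^2/x\right)^2}
  = \frac{\sigma^2\,(x^2+\sigma^2)/x^2}{(x^2+\sigma^2)^2/x^2}
  = \frac{\sigma^2}{x^2+\sigma^2}.
```

Since $f'(c)<0$ for $c<\sigma^2/x$ and $f'(c)>0$ for $c>\sigma^2/x$, this critical point is indeed the minimum, which gives exactly the claimed bound $\mathbb{P}(X>x)\le\frac{\sigma^2}{x^2+\sigma^2}$.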