# Thread: Two problems: expectation inequality and convergence.

1. Two problems: expectation inequality and convergence.

1) Let $\Phi(x)$ be a real-valued function with a continuous, positive second derivative at every point.
Let $X$ be a random variable such that its expectation and that of the random variable $\Phi(X)$ both exist.
Prove that $\Phi(EX) \leq E\Phi(X)$.

2) Prove that if $X_n$ converges to $X$ in $L^p$ norm, then $X_n$ converges to $X$ in probability.

Thanks, guys!!

2. Hello,

For question 1), this is just Jensen's inequality...
Since its second derivative is positive, $\Phi$ is convex, and hence the inequality applies.

I guess it would be a good thing to try using what you proved in question 1) for question 2). I'm too tired to try it now.

By the way, bumping threads is highly discouraged.

3. Sorry for the bumping thing, I wasn't aware of that.

And yes, I know that I have to use Jensen's inequality (we were asked to prove it earlier), but I don't know how... Is it a straightforward computation that I'm not seeing?

And for part 2) I'm completely clueless... I've heard that I have to use Markov's inequality, but again, I have no idea.

As you can see, I'm an awful inequaliser...

4. Well, see Jensen's inequality - Wikipedia, the free encyclopedia: there's the measure-theoretic form, which is the one you need here.

It basically says that for a convex function $\phi$ (which is the case here, since the second derivative is always positive), $\phi\left(\int_{\Omega} f ~d\mu\right)\leq \int_{\Omega} \phi \circ f ~d\mu$, where $\mu$ is a probability measure and $f$ is a $\mu$-integrable function.

So here just take $f(x)=x$ on $(\mathbb{R},\mu_X)$, where $\mu_X$ is the law of $X$, and refer to the definition of the expectation:
$\mathbb{E}(h(X))=\int_{\mathbb{R}} h(x) ~\mu_X(dx)$
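If it helps to see the inequality in action before proving it, here is a quick numerical sanity check (a sketch, not a proof). The choice of $\phi(x)=x^2$ and the normal distribution are my own illustrative assumptions, not part of the problem:

```python
import numpy as np

# Numerical sanity check of Jensen's inequality:
# for a convex phi (here phi(x) = x**2, so phi''(x) = 2 > 0 everywhere)
# and samples of X, we expect phi(E X) <= E phi(X).
rng = np.random.default_rng(0)

def phi(x):
    # Convex: second derivative is 2 > 0 at every point.
    return x ** 2

x = rng.normal(loc=1.0, scale=2.0, size=100_000)  # samples of a hypothetical X
lhs = phi(x.mean())   # estimate of phi(E X)
rhs = phi(x).mean()   # estimate of E phi(X)
print(lhs <= rhs)     # expected True
```

Here $EX \approx 1$ so the left side is about $1$, while $E[X^2] = 1 + 4 = 5$, so the gap is large; for an affine $\phi$ the two sides would coincide.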

Oh for the second one...
You know that $X_n$ converges to $X$ in probability iff $\forall \varepsilon>0 ~,~\lim_{n\to\infty} \mathbb{P}(|X_n-X|>\varepsilon)=0$

Since $X_n \to X$ in $L^p$, this means that $\mathbb{E}(|X_n-X|^p) \to 0$.
This means that $\forall \delta>0, \exists N, \forall n>N, \mathbb{E}(|X_n-X|^p)<\delta$ (using $\delta$ here, to keep it separate from the $\varepsilon$ below).

Let $\varepsilon>0$:

$\mathbb{P}(|X_n-X|>\varepsilon)=\mathbb{P}(|X_n-X|^p>\varepsilon^p)$

By Markov's inequality, can you conclude?
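To see numerically why this chain of steps works (again a sketch, not the proof), here is a small simulation. The specific model $X_n - X \sim N(0, 1/n^2)$ is my own hypothetical example of a sequence converging in $L^p$; the point is that the empirical probability always sits below the Markov bound $\mathbb{E}(|X_n-X|^p)/\varepsilon^p$, and both go to $0$:

```python
import numpy as np

# Illustration of P(|X_n - X| > eps) = P(|X_n - X|^p > eps^p)
#                                   <= E|X_n - X|^p / eps^p   (Markov),
# so E|X_n - X|^p -> 0 forces the probability to 0.
rng = np.random.default_rng(1)
p, eps = 2, 0.5
size = 100_000

for n in [1, 10, 100]:
    # Hypothetical example: X_n - X ~ Normal(0, 1/n**2).
    diff = np.abs(rng.normal(scale=1.0 / n, size=size))
    prob = (diff > eps).mean()             # estimate of P(|X_n - X| > eps)
    bound = (diff ** p).mean() / eps ** p  # Markov bound: E|X_n - X|^p / eps^p
    print(n, prob, bound)
    assert prob <= bound                   # empirical check of Markov's bound
```

For $n=1$ the bound is loose (around $4$), but by $n=100$ both the probability and the bound are essentially $0$, which is exactly the squeeze the proof uses.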