
Two problems: expectation inequality and convergence.

  1. #1 Junior Member

    1) Let \Phi(x) be a real-valued function with a continuous and positive second derivative at every point.
    Let X be a random variable such that the expectations of X and of \Phi(X) both exist.
    Prove that \Phi(\mathbb{E}X) \leq \mathbb{E}\Phi(X).

    2) Prove that if X_n converges to X in L^p norm, then X_n converges to X in probability.



    Thanks, guys!!

  2. #2 Moo

    Hello,

    For question 1), this is just Jensen's inequality...
    Since its second derivative is positive everywhere, \Phi is convex (the standard second-derivative criterion), and hence Jensen's inequality applies.

    I guess it would be a good thing to try using what you proved in question 1) for question 2). I'm too tired to try it now.

    By the way, bumping threads is strongly discouraged.

  3. #3 Junior Member

    Sorry for the bumping thing; I wasn't aware of that.

    And yes, I know that I have to use Jensen's inequality (it was to be proven earlier), but I don't know how... Is it a straightforward computation that I'm not seeing?

    And for part 2) I'm completely clueless... I've heard that I have to use Markov's inequality, but again, I have no idea.

    As you can see, I'm an awful inequaliser...

  4. #4 Moo

    Well, see Jensen's inequality - Wikipedia, the free encyclopedia; there's the measure-theoretic form, which is the one you need here.

    It basically says that for a convex function \phi (which is the case here, since the second derivative is always positive), \phi\left(\int_{\Omega} f ~d\mu\right)\leq \int_{\Omega} \phi \circ f ~d\mu, where \mu is a probability measure.

    So here, take \Omega=\mathbb{R}, f(x)=x, and \mu the distribution of X, and recall the definition of the expectation:
    \mathbb{E}(h(X))=\int_{\mathbb{R}} h(x) ~\mu(dx)
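
    To spell the computation out (this is just the two displays above strung together, with \mu the distribution of X; one way to write it, not the only one):

    \Phi(\mathbb{E}X)=\Phi\left(\int_{\mathbb{R}} x ~\mu(dx)\right)\leq \int_{\mathbb{R}} \Phi(x) ~\mu(dx)=\mathbb{E}\Phi(X)

    which is exactly the inequality you want.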



    Oh for the second one...
    You know that X_n converges to X in probability iff \forall \varepsilon>0 ~,~\lim_{n\to\infty} \mathbb{P}(|X_n-X|>\varepsilon)=0

    Since X_n \to X in L^p, this means that \mathbb{E}(|X_n-X|^p) \to 0
    This means that \forall \delta>0, ~\exists N, ~\forall n>N, ~\mathbb{E}(|X_n-X|^p)<\delta


    Let \varepsilon>0:

    \mathbb{P}(|X_n-X|>\varepsilon)=\mathbb{P}(|X_n-X|^p>\varepsilon^p)

    By Markov's inequality, can you conclude?
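
    (In case the last step isn't clear: Markov's inequality, applied to the nonnegative random variable |X_n-X|^p, gives

    \mathbb{P}(|X_n-X|^p>\varepsilon^p)\leq \frac{\mathbb{E}(|X_n-X|^p)}{\varepsilon^p}

    and the right-hand side tends to 0 by the L^p convergence, so \mathbb{P}(|X_n-X|>\varepsilon)\to 0, i.e. X_n\to X in probability.)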
