
Math Help - Convergence in Distribution

  1. #1
    Junior Member
    Joined
    Nov 2008
    Posts
    53

    Convergence in Distribution

    Let the r.v. Yn have a distribution that is b(n,p).

    (a) Prove that Yn / n converges in probability to p. This result is one form of the weak law of large numbers.

    (b) Prove that 1 - Yn / n converges in probability to 1 - p.

    (c) Prove that (Yn / n)(1 - Yn / n) converges in probability to p(1-p).


  2. #2
    MHF Contributor matheagle's Avatar
    Joined
    Feb 2009
    Posts
    2,763
    Thanks
    5
    Okay, the first is just Chebyshev's.
    P(|Y/n - p| > epsilon) = P(|Y - np| > n epsilon)
    By Chebyshev's inequality this is
    <= V(Y)/(n^2 epsilon^2)
    = (npq)/(n^2 epsilon^2)
    = (pq)/(n epsilon^2),
    which goes to zero as n -> infinity.
    We can quote a result someone else asked me about last week:
    if an estimator is unbiased and its variance goes to zero, then it is consistent.
    That's exactly what I just did here: E(Y/n) = p and V(Y/n) = pq/n -> 0.
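    If you want to see the bound in action, here's a quick Monte Carlo sketch (not part of the proof; the values p = 0.3, epsilon = 0.1 and the trial count are arbitrary choices of mine). It estimates P(|Y/n - p| > epsilon) for growing n and prints it next to the Chebyshev bound pq/(n epsilon^2) derived above:

    ```python
    import random

    # Monte Carlo check of the Chebyshev bound from part (a).
    # p, eps, trials are illustrative choices, not from the problem.
    random.seed(0)
    p, eps, trials = 0.3, 0.1, 2000

    def tail_prob(n):
        """Estimate P(|Y_n/n - p| > eps) with Y_n ~ Binomial(n, p)."""
        hits = 0
        for _ in range(trials):
            # draw Y_n as a sum of n Bernoulli(p) indicators
            y = sum(random.random() < p for _ in range(n))
            if abs(y / n - p) > eps:
                hits += 1
        return hits / trials

    for n in (10, 100, 1000):
        bound = p * (1 - p) / (n * eps ** 2)   # pq/(n eps^2), capped at 1
        print(n, tail_prob(n), min(bound, 1.0))
    ```

    The estimated tail probability shrinks toward zero as n grows, and stays under the (often loose) Chebyshev bound.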

    --------------------------------------------------------------------------------

    I'm going to make dinner.
    I'll do parts b,c later.

    By the way, this is not convergence in distribution; it is a stronger mode of convergence, namely convergence in probability.
    Convergence in distribution is where a sequence of cdfs converges to a cdf, like in the central limit theorem.
    LOL, see
    http://www.statisticalengineering.com/convergence.htm
    "Convergence in probability" is not quite the same as convergence in distribution.
    this is quite useful...
    http://en.wikipedia.org/wiki/Converg...ndom_variables

    ----------------------------------------------------------------------------------

    Part b can be done two ways.
    It's basically the same as (a): just switch what counts as a success and a failure, so n - Y is b(n,q).
    That swaps p and q and you're done.
    Alternatively, let epsilon > 0...
    P(|(1 - Y/n) - (1 - p)| > epsilon)
    = P(|p - Y/n| > epsilon)
    = P(|Y/n - p| > epsilon) since |-1| = 1
    = P(|Y - np| > n epsilon)...
    and the rest is as before.
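    A small numerical illustration of the first route (a sketch only; p = 0.3, n = 5000 and the trial count are my own choices): n - Y counts the failures, so 1 - Y/n = (n - Y)/n should concentrate at 1 - p.

    ```python
    import random

    # Illustration for part (b): 1 - Y/n concentrates at 1 - p.
    # Parameter values are arbitrary, chosen just for the demo.
    random.seed(1)
    p, n, trials = 0.3, 5000, 400

    estimates = []
    for _ in range(trials):
        y = sum(random.random() < p for _ in range(n))  # Y ~ Binomial(n, p)
        estimates.append(1 - y / n)                     # failure proportion

    mean_est = sum(estimates) / trials
    print(mean_est)  # close to 1 - p = 0.7
    ```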

    -------------------------------------------------------------------------------

    Part c follows from the general fact:
    if x_n -> a and y_n -> b (both in probability),
    then (x_n)(y_n) -> ab, in probability.
    But let me prove your problem directly via the triangle inequality.
    Let epsilon > 0...
    P(|(Y/n)(1 - Y/n) - p(1-p)| > epsilon)
    = P(|(Y/n) - (Y/n)^2 - p + p^2| > epsilon)
    = P(|[(Y/n) - p] + [p^2 - (Y/n)^2]| > epsilon) now the triangle inequality
    <= P(|(Y/n) - p| > epsilon/2) + P(|(Y/n)^2 - p^2| > epsilon/2),
    because if the sum exceeds epsilon in absolute value, at least one of the two terms must exceed epsilon/2 in absolute value.
    We've already shown the first term goes to zero.
    The second term goes to zero because Y/n -> p in probability implies (Y/n)^2 -> p^2 in probability.
    In general, if x_n -> a in probability and g(.) is any continuous function,
    then g(x_n) -> g(a) in probability.
    That can be found at...
    http://planetmath.org/encyclopedia/C...ormations.html
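    Here's the continuous-mapping fact in action for this problem (an illustrative sketch, not a proof; p = 0.3, epsilon = 0.05 and the trial count are my choices): with g(x) = x(1 - x), the quantity g(Y/n) = (Y/n)(1 - Y/n) should concentrate at g(p) = p(1-p).

    ```python
    import random

    # Sketch of the continuous-mapping step in part (c):
    # Y/n -> p in probability and g(x) = x(1-x) is continuous,
    # so g(Y/n) concentrates at g(p) = p(1-p).
    random.seed(2)
    p, eps, trials = 0.3, 0.05, 1000

    def g(x):
        return x * (1 - x)

    def tail(n):
        """Estimate P(|g(Y_n/n) - g(p)| > eps)."""
        hits = 0
        for _ in range(trials):
            y = sum(random.random() < p for _ in range(n))
            if abs(g(y / n) - g(p)) > eps:
                hits += 1
        return hits / trials

    print(tail(20), tail(2000))  # the tail probability shrinks as n grows
    ```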
    Last edited by mr fantastic; February 22nd 2009 at 01:59 AM. Reason: Merged posts

