
Thread: 2 convergence in distribution questions

  1. #1
    yui
    Newbie
    Joined
    Sep 2008
    Posts
    4

    2 convergence in distribution questions

    First one:
    Suppose $\displaystyle X_n \sim Bin(n, p)$. Show that as $\displaystyle n \rightarrow \infty$
    $\displaystyle
    \frac{X^2_n - n^2 p^2}{2(pn)^\frac{3}{2} \sqrt{1-p}} \stackrel{D}{\rightarrow} Z \sim N(0,1)
    $
    In a previous question I proved that $\displaystyle \frac{X_n - np}{\sqrt{np(1-p)}} \stackrel{D}{\rightarrow} Z \sim N(0,1)$. For this one, do I just have to express the left-hand side in terms of that quantity and $\displaystyle X^2_n$, and then use the continuous mapping theorem to conclude that it converges in distribution to $Z$ as well? I got stuck on rewriting the left-hand side, so I'm not sure that's the way to go.
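    (Side note, not part of the proof: a minimal simulation sketch of the result from the previous question, assuming Python with numpy is available; the values of n, p and the number of replicates are arbitrary.)

    Code:
    import numpy as np

    # Sanity check: (X_n - n*p) / sqrt(n*p*(1-p)) should look approximately N(0,1) for large n.
    rng = np.random.default_rng(0)
    n, p, reps = 5000, 0.3, 100_000
    x = rng.binomial(n, p, size=reps)
    z = (x - n * p) / np.sqrt(n * p * (1 - p))
    print(z.mean(), z.std())    # expect values close to 0 and 1
    print(np.mean(z <= 1.96))   # expect a value close to 0.975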


    Second one:
    Show that $\displaystyle (1-X_n)^{-1} \stackrel{D}{\rightarrow} (1-X)^{-1}$, given that $\displaystyle P(X_n = \frac{i}{n}) = \frac{1}{n}$ for $\displaystyle i = 0, 1, 2, \ldots, n-1$.
    I tried to prove it using the MGF of Xn but I got stuck on the algebra after
    $\displaystyle
    E(e^{\lambda X_n}) = \sum_{x=\frac{i}{n}}^{\frac{n-1}{n}} e^{\lambda x} P(X=x)
    $
    I'm not even sure the sum is written correctly, since I didn't specify a range for $\displaystyle i$. Is using the MGF the way to go, or should I go back to the definition or use some other method?

  2. #2
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    For your first question, the key trick is to write $\displaystyle X_n^2-n^2p^2=(X_n-np)(X_n+np)$. You know that $\displaystyle X_n-np$, suitably normalized, converges in distribution. As for $\displaystyle X_n+np$, the LLN shows that $\displaystyle \frac{X_n+np}{np}=\frac{X_n}{np}+1$ converges almost-surely (and hence in distribution) to 2.
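    (A minimal numerical illustration of this claim, assuming numpy; the values of n, p and the number of replicates are arbitrary.)

    Code:
    import numpy as np

    # Sanity check: (X_n^2 - (n*p)^2) / (2*(n*p)^(3/2)*sqrt(1-p)) should look approximately N(0,1).
    rng = np.random.default_rng(1)
    n, p, reps = 5000, 0.3, 100_000
    x = rng.binomial(n, p, size=reps).astype(float)
    w = (x**2 - (n * p) ** 2) / (2 * (n * p) ** 1.5 * np.sqrt(1 - p))
    print(w.mean(), w.std())    # expect values close to 0 and 1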

    For the second one, do you have an intuition of what $\displaystyle X_n$ looks like? Once you have it, prove that $\displaystyle (X_n)_n$ converges in distribution to you-guessed-what, using the cumulative distribution function (which I prefer) or the MGF. Then deduce the result about $\displaystyle \frac{1}{1-X}$.

    By the way, the MGF is $\displaystyle E[e^{\lambda X_n}]=\sum_{i=0}^{n-1}e^{\lambda\frac{i}{n}}P(X_n=\frac{i}{n})=\sum_{i=0}^{n-1}e^{\lambda\frac{i}{n}}\frac{1}{n}$. You can simplify this expression (it is a sum of $\displaystyle n$ terms of a geometric sequence...).

  3. #3
    yui
    Newbie
    Joined
    Sep 2008
    Posts
    4
    Ok so for the first one:
    $\displaystyle
    \frac{X^2_n - n^2 p^2}{2(pn)^\frac{3}{2} \sqrt{1-p}} = \frac{X_n - np}{\sqrt{np(1-p)}} \cdot \frac{X_n + np}{np} \cdot \frac{1}{2}
    $

    $\displaystyle
    \frac{X_n+np}{np} = \frac{X_n}{np} +1
    $ converges to 2
    In a previous question I found that $\displaystyle \frac{X_n - np}{\sqrt{np(1-p)}}$ converges in distribution to $N(0,1)$, and since $\displaystyle \frac{X_n+np}{np} \cdot \frac{1}{2}$ converges to 1, Slutsky's theorem gives that the product above converges in distribution to $N(0,1)$ as well.


    I'm still a bit lost on the second question, though.
    Xn is discrete normal (?) on the points from 0 to 1, each with probability 1/n, so then X would be continuous normal on [0,1] as well?
    $\displaystyle
    E[e^{\lambda X_n}] = \sum_{i=0}^{n-1} e^{\lambda\frac{i}{n}}\frac{1}{n} = \frac{1}{n} \sum_{i=0}^{n-1} e^{\lambda\frac{i}{n}}
    $

    $\displaystyle
    E[e^{\lambda X}] = \int_{0}^{1} e^{\lambda x}\, dx = \frac{e^{\lambda} - 1}{\lambda}
    $
    I'm still not sure what to do with the sum in the first MGF, though.

  4. #4
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Second part: $\displaystyle X_n$ is chosen uniformly in $\displaystyle \{0,\frac{1}{n},\ldots,\frac{n-1}{n}\}$. As $\displaystyle n$ tends to $\displaystyle +\infty$, $\displaystyle X_n$ looks more and more as if it were uniformly distributed on $\displaystyle [0,1]$, doesn't it? There's no normal distribution here.

    $\displaystyle E[e^{\lambda X_n}] = \sum_{i=0}^{n-1} e^{\lambda\frac{i}{n}}\frac{1}{n} = \frac{1}{n}\sum_{i=0}^{n-1}\left(e^{\frac{\lambda}{n}}\right)^i = \frac{1}{n}\,\frac{e^{\lambda}-1}{e^{\frac{\lambda}{n}}-1}$ and $\displaystyle e^{\frac{\lambda}{n}}-1\sim \frac{\lambda}{n}$ as $\displaystyle n$ tends to $\displaystyle \infty$...
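    (A quick numerical check of that limit, assuming numpy; lambda = 2 is an arbitrary choice.)

    Code:
    import numpy as np

    lam = 2.0
    target = (np.exp(lam) - 1) / lam          # MGF of the Uniform(0,1) distribution at lam
    for n in (10, 100, 1000, 10000):
        # exact MGF of X_n: the average of exp(lam*i/n) over i = 0, ..., n-1
        mgf_n = np.mean(np.exp(lam * np.arange(n) / n))
        print(n, mgf_n, target)
    # mgf_n approaches target as n grows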

  5. #5
    yui
    Newbie
    Joined
    Sep 2008
    Posts
    4
    Oops, I meant discrete and continuous uniform; I must have mixed it up with the first question.

    $\displaystyle \frac{1}{n}\frac{e^{\lambda}-1}{e^{\frac{\lambda}{n}}-1} \rightarrow \frac{e^\lambda -1}{\lambda}$ as $\displaystyle n$ tends to $\displaystyle \infty $ which is $\displaystyle E[e^{\lambda X}]$

    Thank you so much for your help Laurent.
