# Thread: 2 convergence in distribution questions

1. ## 2 convergence in distribution questions

First one:
Suppose $X_n \sim Bin(n, p)$. Show that as $n \rightarrow \infty$
$
\frac{X^2_n - n^2 p^2}{2(pn)^{\frac{3}{2}} \sqrt{1-p}} \stackrel{D}{\rightarrow} Z \sim N(0,1)
$

In a previous question I proved that $\frac{X_n - np}{\sqrt{np(1-p)}} \stackrel{D}{\rightarrow} Z \sim N(0,1)$. So for this one, do I just have to write the left-hand side in terms of $X^2_n$ and then use the continuous mapping theorem to conclude it converges in distribution to $Z$ as well? But I got stuck on rewriting the left-hand side as $X^2_n$, so I'm not sure if that's the way to go.
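As a quick sanity check (not a proof), the claimed convergence is easy to probe by simulation; the values of `n`, `p`, and the number of trials below are arbitrary illustration choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, trials = 10_000, 0.3, 50_000  # arbitrary illustration parameters

# Draw X_n ~ Bin(n, p) many times and form the statistic in question
x = rng.binomial(n, p, size=trials).astype(float)
stat = (x**2 - (n * p) ** 2) / (2 * (n * p) ** 1.5 * np.sqrt(1 - p))

# If the claimed convergence holds, these should be close to 0 and 1
print("mean:", stat.mean(), "std:", stat.std())
```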

Second one:
Show that $(1-X_n)^{-1} \stackrel{D}{\rightarrow} (1-X)^{-1}$, given that $P(X_n = \frac{i}{n}) = \frac{1}{n}$ for $i = 0, 1, 2, \ldots, n-1$.
I tried to prove it using the MGF of $X_n$, but I got stuck on the algebra after
$
E(e^{\lambda X_n}) = \sum_{x \in \{0, \frac{1}{n}, \ldots, \frac{n-1}{n}\}} e^{\lambda x} P(X_n = x)
$

I'm not even sure the sum is right, since I didn't write anything down for $i$. Is using the MGF the way to go, or should I go back to the definition or use some other method?

2. For your first question, the key trick is to write $X_n^2-n^2p^2=(X_n-np)(X_n+np)$. You know that $X_n-np$, suitably normalized, converges in distribution. As for $X_n+np$, the LLN shows that $\frac{X_n+np}{np}=\frac{X_n}{np}+1$ converges almost surely (and hence in distribution) to 2.
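To see the LLN step concretely, here is a small numerical illustration (the value of $p$ is arbitrary): $\frac{X_n}{np}+1$ concentrates around 2 as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(1)
p = 0.3  # arbitrary illustration value

# X_n/(np) + 1 should concentrate around 2 as n grows
for n in (100, 10_000, 1_000_000):
    x = rng.binomial(n, p, size=5)
    print(n, x / (n * p) + 1)
```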

For the second one, do you have an intuition of what $X_n$ looks like? Once you have it, prove that $(X_n)_n$ converges in distribution to you-guessed-what, using the cumulative distribution function (which I prefer) or the MGF. Then deduce the result about $\frac{1}{1-X}$.

By the way, the MGF is $E[e^{\lambda X_n}]=\sum_{i=0}^{n-1}e^{\lambda\frac{i}{n}}P(X_n=\frac{i}{n})=\sum_{i =0}^{n-1}e^{\lambda\frac{i}{n}}\frac{1}{n}$. You can simplify this expression (sum of $n$ terms in a geometric sequence...).

3. Ok so for the first one:
$
\frac{X^2_n - n^2 p^2}{2(pn)^{\frac{3}{2}} \sqrt{1-p}} = \frac{X_n - np}{\sqrt{np(1-p)}} \cdot \frac{X_n + np}{np} \cdot \frac{1}{2}
$

$
\frac{X_n+np}{np} = \frac{X_n}{np} +1
$
converges almost surely (and hence in distribution) to 2.
In a previous question I found that $\frac{X_n - np}{\sqrt{np(1-p)}}$ converges in distribution to $N(0,1)$, so by Slutsky's theorem the product above converges in distribution to $N(0,1)$ as well.

I'm still a bit lost on the second question though.
Xn is discrete normal (?) from 0-1 with probability 1/n then X is continuous normal from 0-1 as well.
$
E[e^{\lambda X_n}] = \sum_{i=0}^{n-1} e^{\lambda\frac{i}{n}}\frac{1}{n} = \frac{1}{n} \sum_{i=0}^{n-1} e^{\lambda\frac{i}{n}}
$

$
E[e^{\lambda X}] = \int_{0}^{1} e^{\lambda x}\, dx = \frac{e^{\lambda} - 1}{\lambda}
$
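One way to check the guess numerically (a sketch, not a proof): evaluate the discrete MGF as a plain sum and compare it with $\frac{e^\lambda - 1}{\lambda}$ for growing $n$; $\lambda = 2$ below is an arbitrary test point.

```python
import numpy as np

lam = 2.0  # arbitrary test point
mgf_limit = (np.exp(lam) - 1) / lam  # MGF of Uniform(0, 1) at lam

for n in (10, 100, 1000):
    # MGF of the discrete uniform on {0, 1/n, ..., (n-1)/n}
    mgf_n = np.mean(np.exp(lam * np.arange(n) / n))
    print(n, mgf_n, mgf_limit)
```

The gap shrinks with $n$, which is exactly the pointwise MGF convergence the argument needs.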

Not sure what to do with the sigma part in the first MGF still.

4. Second part: $X_n$ is chosen uniformly in $\{0,\frac{1}{n},\ldots,\frac{n-1}{n}\}$. As $n$ tends to $+\infty$, $X_n$ looks more and more as if it were uniformly distributed on $[0,1]$, doesn't it? There's no normal distribution here.

$E[e^{\lambda X_n}] = \sum_{i=0}^{n-1} e^{\lambda\frac{i}{n}}\frac{1}{n} = \frac{1}{n}\sum_{i=0}^{n-1}\left(e^{\frac{\lambda}{n}}\right)^i=\frac{1}{n} \frac{e^{\lambda}-1}{e^{\frac{\lambda}{n}}-1}$ and $e^{\frac{\lambda}{n}}-1\sim \frac{\lambda}{n}$ as $n$ tends to $\infty$...
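The geometric closed form and its limit can be verified numerically as well (a sketch; $\lambda$ and $n$ below are arbitrary test values):

```python
import numpy as np

lam, n = 1.7, 1000  # arbitrary test values

# Direct sum versus the closed geometric-series form
direct = np.mean(np.exp(lam * np.arange(n) / n))
closed = (np.exp(lam) - 1) / (n * np.expm1(lam / n))
limit = (np.exp(lam) - 1) / lam  # candidate limit: MGF of Uniform(0, 1)

print(direct, closed, limit)
```

`np.expm1` computes $e^{x}-1$ accurately for small $x$, which matters here since $\frac{\lambda}{n}$ is close to 0.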

5. Oops, I meant discrete and continuous uniform, must've been mixed up from the first question

$\frac{1}{n}\frac{e^{\lambda}-1}{e^{\frac{\lambda}{n}}-1} \rightarrow \frac{e^\lambda -1}{\lambda}$ as $n$ tends to $\infty$ which is $E[e^{\lambda X}]$

Thank you so much for your help Laurent.