1. ## Determining Limiting Distributions

1. Let $\displaystyle \overline X_n$ denote the mean of a random sample of size $n$ from a distribution that is $\displaystyle N(\mu, \sigma ^2)$. Find the limiting distribution of $\displaystyle \overline X_n$.

2. Let $\displaystyle Y_1$ denote the first order statistic of a random sample of size $n$ from a distribution that has the PDF $\displaystyle f(x) = e^{-(x - \theta)}, \ \ \theta < x$. Let $\displaystyle Z_n = n(Y_1 - \theta)$. Find the distribution of $\displaystyle Z_n$.

1.

So to find the limiting distribution, I need to take the limit of the PDF as n approaches infinity.

$\displaystyle X_n = \frac{X_1 + X_2 + ... + X_n}{n} = \frac{nX_n}{n} = X_n$

Kinda stuck here.

Any help would be appreciated.

2. You can't manipulate random variables like that when it comes to distributions. The best way is to go through the moment generating function. You should expect to get another normal distribution, $\displaystyle N(\mu, \sigma ^2/n)$.
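The claim above, that the sample mean of $N(\mu, \sigma^2)$ draws is $N(\mu, \sigma^2/n)$, is easy to check empirically. A minimal simulation sketch (the values of `mu`, `sigma`, `n`, `reps`, and the seed are illustrative choices, not from the thread):

```python
import numpy as np

# Check empirically that the mean of n i.i.d. N(mu, sigma^2) draws
# has mean mu and variance sigma^2 / n.
rng = np.random.default_rng(0)
mu, sigma, n, reps = 2.0, 3.0, 50, 200_000

# reps independent samples of size n; one sample mean per row
means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)

print(means.mean())  # close to mu = 2.0
print(means.var())   # close to sigma^2 / n = 9 / 50 = 0.18
```

A histogram of `means` would also look normal, consistent with the MGF argument developed later in the thread.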

3. Originally Posted by statmajor
1. Let $\displaystyle \overline X_n$ denote the mean of a random sample of size $n$ from a distribution that is $\displaystyle N(\mu, \sigma ^2)$. Find the limiting distribution of $\displaystyle \overline X_n$.

2. Let $\displaystyle Y_1$ denote the first order statistic of a random sample of size $n$ from a distribution that has the PDF $\displaystyle f(x) = e^{-(x - \theta)}, \ \ \theta < x$. Let $\displaystyle Z_n = n(Y_1 - \theta)$. Find the distribution of $\displaystyle Z_n$.

1.

So to find the limiting distribution, I need to take the limit of the PDF as n approaches infinity.

$\displaystyle X_n = \frac{X_1 + X_2 + ... + X_n}{n} = \frac{nX_n}{n} = X_n$

Kinda stuck here.

Any help would be appreciated.
1. It makes absolutely no sense to use the symbol $\displaystyle X_n$ twice in the same equation but to mean two different things.

2. The distribution of $\displaystyle U = X_1 + X_2 + ... + X_n$ is not $\displaystyle n X_n$. Review how to find the distribution of a sum of iid normal random variables.

4. So: $\displaystyle E[e^{Ut}]$

where $\displaystyle U = X_1 + X_2 + ... + X_n$?

5. Originally Posted by statmajor
So: $\displaystyle E[e^{Ut}]$

where $\displaystyle U = X_1 + X_2 + ... + X_n$?
No.

Post #2 told you the distribution. Your job is to understand why this is the distribution. Your job is also to review how to get the moment generating function of a sum of independent random variables.

Then you have to take the limit of the distribution as $\displaystyle n \to +\infty$. The result is unsurprising (Google Dirac delta function).

This question has uncovered several apparent fundamental gaps in your knowledge and understanding. Some of these gaps have been pointed out and you must now take the necessary steps to address this.
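The degenerate limit hinted at above (the "Dirac delta" behaviour) can be seen in a short simulation: as $n$ grows, the $N(\mu, \sigma^2/n)$ distribution of the sample mean piles up at $\mu$. A sketch, with illustrative values for `mu`, `sigma`, the sample sizes, and the seed:

```python
import numpy as np

# As n grows, sample means of N(mu, sigma^2) draws concentrate at mu:
# the limiting distribution is a point mass (degenerate) at mu.
rng = np.random.default_rng(1)
mu, sigma = 2.0, 3.0

fracs = {}
for n in (10, 1_000, 10_000):
    means = rng.normal(mu, sigma, size=(2_000, n)).mean(axis=1)
    # fraction of sample means within 0.1 of mu -> 1 as n -> infinity
    fracs[n] = np.mean(np.abs(means - mu) < 0.1)
    print(n, fracs[n])
```

For any fixed window around $\mu$, the fraction of mass inside it tends to 1, which is exactly what a point mass at $\mu$ means.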

6. Originally Posted by statmajor
So: $\displaystyle E[e^{Ut}]$

where $\displaystyle U = X_1 + X_2 + ... + X_n$?
Okay, but you're missing the $\displaystyle 1/n$ factor on $U$. You also need to do more than that: evaluate that expected value for a start, and then do what the poster above suggested.

7. The sum of n random variables is not n times one of them.
THIS is what you meant...

$\displaystyle \overline X_n = \frac{X_1 + X_2 + ... + X_n}{n}\ne X_n$

8. Let me give this another try:

$\displaystyle E(e^{tX_n}) = E(e^{t \frac{X_1 + ... + X_n}{n}}) = \prod E(e^{t\frac{X_i}{n}})= (e^{\mu t + 0.5 \frac{\sigma^2}{n}t^2})^n$

and I would take the limit of $\displaystyle (e^{\mu t + 0.5 \frac{\sigma^2}{n}t^2})^n$ as n approaches infinity (which would equal infinity)?

or did I make another dumb mistake somewhere?

9. The sample mean converges to $\displaystyle \mu$ almost surely.
All you need is a finite first moment and in this case you even have a finite second moment.
Normality is not necessary.
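The point above, that only a finite first moment is needed and normality is not, can be illustrated with a non-normal distribution. A sketch using Exp(1) draws, whose mean is 1 (the sample size and seed are illustrative):

```python
import numpy as np

# Strong law of large numbers illustration with a non-normal distribution:
# the running mean of i.i.d. Exp(1) draws settles at the true mean, 1.
rng = np.random.default_rng(2)
x = rng.exponential(1.0, size=1_000_000)

# running_mean[k] = average of the first k+1 draws
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)

print(running_mean[9])   # still noisy after 10 draws
print(running_mean[-1])  # close to 1.0 after a million draws
```

Plotting `running_mean` against the sample count would show the familiar picture of a trajectory flattening out at the mean.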

10. but is $\displaystyle (e^{\mu t + 0.5 \frac{\sigma^2}{n}t^2})^n$ correct?

11. Substitute $\displaystyle t/n$ for $t$ in the MGF; that will fix this.

Originally Posted by statmajor
Let me give this another try:

$\displaystyle E(e^{t\overline X_n}) = E(e^{t \frac{X_1 + ... + X_n}{n}}) = \prod E(e^{{t\over n}X_i})= (e^{\mu {t\over n} + 0.5 \frac{\sigma^2}{n^2}t^2})^n\to e^{\mu t}$

or did I make another dumb mistake somewhere?
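The corrected limit quoted above can also be checked symbolically. A sketch using sympy (the symbol names are just the ones from the thread):

```python
import sympy as sp

# Verify that the corrected MGF of the sample mean,
# (exp(mu*t/n + sigma^2*t^2/(2*n^2)))^n, tends to exp(mu*t) as n -> oo,
# i.e. the MGF of a point mass at mu.
t, n = sp.symbols('t n', positive=True)
mu, sigma = sp.symbols('mu sigma', positive=True)

mgf_mean = (sp.exp(mu*t/n + sigma**2 * t**2 / (2*n**2)))**n
lim = sp.limit(mgf_mean, n, sp.oo)
print(lim)  # exp(mu*t)
```

This matches the Dirac-delta remark earlier in the thread: $e^{\mu t}$ is the MGF of the constant $\mu$.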

12. That makes sense.

Thank you (and everyone else) for all the help.