
- Mar 22nd 2013, 09:16 AM, Kenji: Mean square convergence
Hey, can someone help me with this proof?

If $\displaystyle (X_{n})_{n\in\mathbb{N}}$, $\displaystyle (Y_{n})_{n\in\mathbb{N}}$ and $X, Y$ are independent random variables, then

$\displaystyle \operatorname{l.i.m.}(X_{n}\cdot Y_{n})=(\operatorname{l.i.m.} X_{n})\cdot(\operatorname{l.i.m.} Y_{n})$
- Mar 23rd 2013, 03:07 PM, ButterflyM: Re: Mean square convergence
Yes, I once saw a proof similar to that. I'm not quite sure I can put it together formally, but remember that independence implies the covariance is zero, which is exactly the factorization $\displaystyle E[XY]=E[X]E[Y]$.

Now, without going into probability limits: if you just average the first $N$ products and take the ordinary limit, the law of large numbers gives $\displaystyle \lim_{N\rightarrow\infty}\frac{1}{N}\sum_{n=1}^{N}X_{n}Y_{n}=E[XY]$.

By independence, as shown, this equals $\displaystyle E[X]E[Y]$, or, expressed again as limits, $\displaystyle \left(\lim_{N\rightarrow\infty}\frac{1}{N}\sum_{n=1}^{N}X_{n}\right)\left(\lim_{N\rightarrow\infty}\frac{1}{N}\sum_{n=1}^{N}Y_{n}\right)$.
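The factorization $E[XY]=E[X]E[Y]$ under independence is easy to check numerically. Here is a minimal sketch (not part of the proof; the choices Uniform(0, 2) with mean 1 and Normal(3, 1) with mean 3 are arbitrary example distributions, so $E[X]E[Y]=3$):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Independent samples: X_n ~ Uniform(0, 2) (mean 1), Y_n ~ Normal(3, 1) (mean 3)
X = rng.uniform(0, 2, N)
Y = rng.normal(3, 1, N)

# Law of large numbers: (1/N) * sum of X_n * Y_n approaches E[XY]
mean_XY = (X * Y).mean()

# Product of the two sample means approaches E[X] * E[Y]
product_of_means = X.mean() * Y.mean()

print(mean_XY, product_of_means)  # both should be close to 3
```

Both estimates converge to the same value as $N$ grows, which is exactly the zero-covariance statement; for dependent $X$ and $Y$ the two averages would differ by the covariance.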