# Math Help - Mean square convergence

1. ## Mean square convergence

Hey, could someone help me with this proof?
If $(X_{n})_{n \in \mathbb{N}}, (Y_{n})_{n \in \mathbb{N}}$ and $X, Y$ are independent random variables, then
$\operatorname{l.i.m.}(X_{n}\cdot Y_{n})=(\operatorname{l.i.m.} X_{n}) \cdot (\operatorname{l.i.m.} Y_{n})$

2. ## Re: Mean square convergence

Yes, I once saw a proof similar to that. I'm not quite sure I can put it together formally, but remember that independence implies the covariance is zero, which is the same as saying $E[XY]=E[X]E[Y]$.

Now, without going into probability limits: if you average the first $N$ products and take the ordinary limit, the law of large numbers gives $\lim_{N\rightarrow \infty}\frac{1}{N}\sum_{n=1}^{N} X_{n}Y_{n}=E[XY]$.
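A quick numerical sanity check of that law-of-large-numbers step (just a sketch, assuming independent uniform draws stand in for the sequences, so $E[X]E[Y]=0.25$):

```python
import random

random.seed(42)
N = 200_000

# Independent samples X_n ~ U(0,1) and Y_n ~ U(0,1)
xs = [random.random() for _ in range(N)]
ys = [random.random() for _ in range(N)]

# Sample average of the products: (1/N) * sum of X_n * Y_n
mean_xy = sum(x * y for x, y in zip(xs, ys)) / N

# Product of the individual sample averages
mean_x = sum(xs) / N
mean_y = sum(ys) / N

print(mean_xy)          # close to E[XY] = 0.25
print(mean_x * mean_y)  # close to E[X]E[Y] = 0.25
```

Both printed values should agree to about two decimal places, which is the empirical version of $E[XY]=E[X]E[Y]$ for independent variables.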

By independence, as shown above, this equals $E[X]E[Y]$, or expressed again in limits, $\left(\lim_{N\rightarrow \infty}\frac{1}{N}\sum_{n=1}^{N}X_{n}\right)\left(\lim_{N\rightarrow \infty}\frac{1}{N}\sum_{n=1}^{N}Y_{n}\right)$.
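For the original question about the mean-square limit itself, here is how I think the argument would go (a sketch, assuming the limits $X = \operatorname{l.i.m.} X_{n}$ and $Y = \operatorname{l.i.m.} Y_{n}$ exist and the stated independence holds):

$$E\big[(X_{n}Y_{n}-XY)^2\big] \le 2\,E\big[(X_{n}Y_{n}-XY_{n})^2\big] + 2\,E\big[(XY_{n}-XY)^2\big] = 2\,E\big[(X_{n}-X)^2\big]E\big[Y_{n}^2\big] + 2\,E\big[X^2\big]E\big[(Y_{n}-Y)^2\big]$$

using $(a+b)^2 \le 2a^2+2b^2$ for the inequality and independence to factor the expectations. Since $Y_{n} \to Y$ in mean square, $E[Y_{n}^2]$ stays bounded, so both terms on the right tend to zero, giving $X_{n}Y_{n} \to XY$ in mean square.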