The LLN doesn't fit. Yes, intuitively the $1/n^2$ is similar to the $1/n$ in the LLN. But here we're looking for convergence in probability, which is what appears in the weak law of large numbers.
That's why I don't think it's the LLN we need here.
After some wandering, I'm thinking of using Chebyshev's and Jensen's inequalities. The problem is that nowhere do I use the fact that it's a normal distribution... If you added the normal distribution by mistake, please say so.
Chebyshev's inequality can be used because the sequence is independent with a finite second moment.
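For reference, here is the inequality in question, stated for a generic random variable $X$ with finite variance (the exact variables from the problem statement didn't come through, so I keep it generic):

```latex
% Chebyshev's inequality: for every eps > 0,
\mathbb{P}\bigl(\lvert X - \mathbb{E}[X]\rvert \ge \varepsilon\bigr)
  \;\le\; \frac{\operatorname{Var}(X)}{\varepsilon^{2}}
```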
Jensen's inequality is usually stated for convex functions, but for concave functions you just have to reverse the inequality!
So since the square root function is concave, we get that the expectation of the square root is at most the square root of the expectation.
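Concretely, for a generic nonnegative random variable $Y$ (since $x \mapsto \sqrt{x}$ is concave, Jensen reverses):

```latex
\mathbb{E}\bigl[\sqrt{Y}\,\bigr] \;\le\; \sqrt{\mathbb{E}[Y]}
```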
But we know that the sum of the squares of the first $n$ integers is equivalent to $n^3/3$ when $n$ gets big, since $\sum_{k=1}^{n} k^2 = \frac{n(n+1)(2n+1)}{6}$. So we can use that equivalent as $n$ goes to infinity. If you want to prove it properly you can do the calculation, but I don't think it's necessary.
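If you want a quick numerical sanity check of that equivalent (a sketch, not a proof; `sum_of_squares` is just a helper name I'm choosing):

```python
# Sanity check: sum_{k=1}^{n} k^2 is equivalent to n^3 / 3 for large n.
def sum_of_squares(n):
    """Sum of the squares of the first n positive integers."""
    return sum(k * k for k in range(1, n + 1))

# The closed form is n(n+1)(2n+1)/6, so the ratio to n^3/3 tends to 1.
for n in (10, 100, 1000, 10000):
    print(n, sum_of_squares(n) / (n ** 3 / 3))
```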
Hence the expectation tends to $0$ as $n$ tends to infinity.
Now let's compute the variance. By independence of the random variables, the variance of the sum is the sum of the variances.
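In symbols, for independent random variables $X_1, \dots, X_n$ with finite variance:

```latex
\operatorname{Var}\!\left(\sum_{k=1}^{n} X_k\right)
  \;=\; \sum_{k=1}^{n} \operatorname{Var}(X_k)
```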
For a given $k$, you can compute the variance of the $k$-th term explicitly (I'll let you write the missing steps; I'm not here to do all the details).
So here again we have a polynomial equivalent in $n$.
Hence the variance term tends to $0$ as $n$ goes to infinity.
Finally, both the expectation and the variance tend to $0$ when $n \to \infty$.
We're almost there. Now we have to prove that $Z$ converges to $0$ in probability. Writing $Y$ for the other variable in play, note that $|Z| \le |Z - Y| + |Y|$ (and keep in mind that almost sure convergence implies convergence in probability).
- Why? Because by the triangle inequality, if $|Z| \ge \varepsilon$, then $|Z - Y| + |Y| \ge \varepsilon$, so at least one of $|Z - Y|$ and $|Y|$ must be $\ge \varepsilon/2$. Thus $\{|Z| \ge \varepsilon\} \subseteq \{|Z - Y| \ge \varepsilon/2\} \cup \{|Y| \ge \varepsilon/2\}$, and hence the inequality that follows. -
Since $|Z - Y|$ and $|Y|$ are both nonnegative, for any $\varepsilon > 0$ we get $\mathbb{P}(|Z| \ge \varepsilon) \le \mathbb{P}(|Z - Y| \ge \varepsilon/2) + \mathbb{P}(|Y| \ge \varepsilon/2)$.
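Displayed, the final estimate reads as follows (a sketch; here $Y$ stands for the a.s. convergent comparison term, and both probabilities on the right vanish by the previous steps):

```latex
\mathbb{P}\bigl(|Z| \ge \varepsilon\bigr)
  \;\le\; \mathbb{P}\bigl(|Z - Y| \ge \tfrac{\varepsilon}{2}\bigr)
        + \mathbb{P}\bigl(|Y| \ge \tfrac{\varepsilon}{2}\bigr)
  \;\xrightarrow[\;n \to \infty\;]{}\; 0
```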
Both terms on the right tend to $0$, which proves that $Z$ converges to $0$ in probability.