Let $X_1, X_2, \ldots$ be a sequence of independent random variables with $E[X_i] = \mu_i$ and $\operatorname{Var}(X_i) = \sigma_i^2$. Show that if $\frac{1}{n}\sum_{i=1}^n \mu_i \to \mu$ and $\frac{1}{n^2}\sum_{i=1}^n \sigma_i^2 \to 0$, then $\overline{X}_n \to \mu$ in probability.
Thing is, I'm not quite sure why the inequality you mentioned holds. I can see intuitively how it works, but is there a theorem or proof for it?
EDIT: Nvm, figured it out. Thanks for everything!
Also, I edited the question: I had left out the condition on the variances (I suppose it's there for applying Chebyshev's inequality later).
I KNEW you left something out!
The second term goes to zero since the $\mu_i$ are constants and the difference $\left|\frac{1}{n}\sum_{i=1}^n \mu_i - \mu\right|$ goes to zero by assumption.
For the first term, you square the absolute value and apply Markov's inequality to the square, i.e., Chebyshev's inequality.
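The two-term argument above can be written out explicitly; a sketch, using the hypotheses from the edited question:

```latex
% Split the deviation with the triangle inequality:
\[
  \left|\overline{X}_n - \mu\right|
  \le \underbrace{\left|\overline{X}_n - \tfrac{1}{n}\textstyle\sum_{i=1}^n \mu_i\right|}_{\text{random}}
    + \underbrace{\left|\tfrac{1}{n}\textstyle\sum_{i=1}^n \mu_i - \mu\right|}_{\text{deterministic, } \to\, 0}.
\]
% For the first term, Chebyshev's inequality (Markov applied to the square)
% and independence (so the variances add) give, for any fixed eps > 0:
\[
  P\!\left(\left|\overline{X}_n - \tfrac{1}{n}\textstyle\sum_{i=1}^n \mu_i\right| \ge \varepsilon\right)
  \le \frac{\operatorname{Var}(\overline{X}_n)}{\varepsilon^2}
  = \frac{1}{n^2 \varepsilon^2} \sum_{i=1}^n \sigma_i^2
  \;\longrightarrow\; 0.
\]
```

Since both terms vanish, $\overline{X}_n \to \mu$ in probability.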
I didn't think this was complete.
Either you needed that the sum of the variances divided by $n^2$ goes to zero, OR another way was that all the variances were equal, OR that the sup of the variances was bounded... (either of the last two implies the first, since then $\frac{1}{n^2}\sum_{i=1}^n \sigma_i^2 \le \frac{\sup_i \sigma_i^2}{n} \to 0$). But something was off.
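To see the variance condition in action, here's a minimal simulation sketch. The choices $\mu_i = 1 + 1/i$ and $\sigma_i^2 = \sqrt{i}$ are my own illustration, not from the question: the variances are unbounded, yet $\frac{1}{n^2}\sum_{i=1}^n \sigma_i^2 \sim \frac{2}{3}n^{-1/2} \to 0$, so the sample mean still converges to $1$ in probability.

```python
import random

def sample_mean(n, seed=0):
    """Average of n independent X_i ~ N(mu_i, sigma_i^2).

    Illustrative (hypothetical) choices:
      mu_i      = 1 + 1/i   ->  (1/n) * sum(mu_i) -> 1
      sigma_i^2 = sqrt(i)   ->  (1/n^2) * sum(sigma_i^2) ~ (2/3) n^(-1/2) -> 0
    so the hypotheses hold even though Var(X_i) is unbounded.
    """
    rng = random.Random(seed)
    total = 0.0
    for i in range(1, n + 1):
        mu_i = 1.0 + 1.0 / i
        sigma_i = i ** 0.25          # sigma_i^2 = sqrt(i)
        total += rng.gauss(mu_i, sigma_i)
    return total / n

if __name__ == "__main__":
    for n in (100, 10_000, 200_000):
        print(f"n = {n:>7}:  sample mean = {sample_mean(n):.4f}")
```

The concentration is slow here because $\operatorname{Var}(\overline{X}_n)$ decays only like $n^{-1/2}$, which matches the Chebyshev bound above.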