
- Feb 4th 2010, 03:25 PM, h2osprey: Central Limit Theorem
Let $X_1, X_2, \ldots$ be a sequence of independent random variables with $E[X_i] = \mu_i$ and $\operatorname{Var}(X_i) = \sigma_i^2$. Show that if $\frac{1}{n}\sum_{i=1}^{n}\mu_i \to \mu$ and $\frac{1}{n^2}\sum_{i=1}^{n}\sigma_i^2 \to 0$, then $\frac{1}{n}\sum_{i=1}^{n}X_i \to \mu$ in probability.

- Feb 4th 2010, 09:47 PM, matheagle

By the triangle inequality,

$$\left|\frac{1}{n}\sum_{i=1}^{n}X_i-\mu\right| \le \left|\frac{1}{n}\sum_{i=1}^{n}(X_i-\mu_i)\right| + \left|\frac{1}{n}\sum_{i=1}^{n}\mu_i-\mu\right|.$$
- Feb 6th 2010, 10:10 PM, h2osprey
Ah I suppose after that you just apply Chebyshev's inequality and the proof is done.

Thing is, I'm not quite sure why the inequality you mentioned holds. I can see intuitively how it works, but is there a theorem or proof for it?

**EDIT: Nvm, figured it out. Thanks for everything!**
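
Spelled out, the Chebyshev step mentioned above looks like this (a sketch assuming the standard form of the problem, with $E[X_i]=\mu_i$ and $\operatorname{Var}(X_i)=\sigma_i^2$):

```latex
% Chebyshev's inequality applied to the centered sample mean;
% by independence, Var((1/n) * sum X_i) = (1/n^2) * sum sigma_i^2.
\[
P\!\left(\left|\frac{1}{n}\sum_{i=1}^{n}(X_i-\mu_i)\right|>\varepsilon\right)
\le \frac{1}{\varepsilon^{2}}\operatorname{Var}\!\left(\frac{1}{n}\sum_{i=1}^{n}X_i\right)
= \frac{1}{n^{2}\varepsilon^{2}}\sum_{i=1}^{n}\sigma_i^{2}
\;\longrightarrow\; 0.
\]
```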

Also, I edited the question; I had left out the fact that the variances converge (I suppose this is needed for applying Chebyshev's inequality later).

- Feb 6th 2010, 10:15 PM, matheagle
I KNEW you left something out!

The second term goes to zero since those are constants and the difference goes to zero.

And for the first term, you square the absolute value and use Markov's/Chebyshev's inequality.

I didn't think this was complete.

Either you needed that the sum of the variances divided by $n^2$ goes to zero,

OR that all the variances were equal, OR that the sup of the variances was bounded....
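
For example, under the bounded-variances alternative (a quick check, assuming $\sigma_i^2 \le M$ for all $i$), the needed condition holds automatically:

```latex
% sup of the variances bounded by M forces the normalized sum to vanish:
\[
\frac{1}{n^{2}}\sum_{i=1}^{n}\sigma_i^{2}
\le \frac{nM}{n^{2}} = \frac{M}{n} \;\longrightarrow\; 0,
\]
% and equal variances sigma_i^2 = sigma^2 is the special case M = sigma^2.
```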

But something was off.

- Feb 6th 2010, 10:25 PM, matheagle
- Feb 7th 2010, 09:14 AM, h2osprey
Yup, that's what I got. Thanks a lot for the help!

Sorry for the missing variances thing... it was part of the previous question, so I left it out by accident.

- Feb 7th 2010, 01:50 PM, matheagle
I couldn't finish your problem, since it didn't seem correct without some more info.

That's why I stopped after using the triangle inequality.
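
As an illustration (not part of the thread), here is a small simulation sketch of the result. The specific choices $\mu_i = 1 + 1/i$ and $\sigma_i^2 = \sqrt{i}$ are made up for the demo; they satisfy both hypotheses (the averaged means tend to $\mu = 1$, and $\frac{1}{n^2}\sum\sigma_i^2 = O(n^{-1/2}) \to 0$), so the sample mean should concentrate around 1 as $n$ grows.

```python
import random

def sample_mean(n, rng):
    """Mean of X_1..X_n with X_i ~ Normal(mu_i, sigma_i^2) (assumed demo distributions)."""
    total = 0.0
    for i in range(1, n + 1):
        mu_i = 1.0 + 1.0 / i    # Cesàro average of the mu_i tends to mu = 1
        sigma_i = i ** 0.25     # sigma_i^2 = sqrt(i), so (1/n^2)*sum sigma_i^2 -> 0
        total += rng.gauss(mu_i, sigma_i)
    return total / n

rng = random.Random(0)
coverage = {}
for n in (100, 10_000):
    means = [sample_mean(n, rng) for _ in range(200)]
    # fraction of replications landing within 0.25 of the limit mu = 1
    coverage[n] = sum(abs(m - 1.0) < 0.25 for m in means) / len(means)

print(coverage)
```

Running it, the fraction of sample means within 0.25 of the limit should rise toward 1 as $n$ increases, which is exactly what "in probability" promises.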