Converge in probability of sample variance

Let $\displaystyle X_1, X_2, \ldots, X_n$ be independent and identically distributed observations from a population with mean $\displaystyle \mu$ and finite variance $\displaystyle \sigma^2$. The weak law of large numbers states that $\displaystyle \bar{X}_n \rightarrow^p \mu$, and I can prove this part. However, does $\displaystyle S^2 \rightarrow^p \sigma^2$, where $\displaystyle S^2 = \frac{1}{n-1} \sum_{i=1}^n (X_i - \bar{X}_n)^2$ is the sample variance? If so, how do I prove it?

Re: Converge in probability of sample variance

Hey usagi_killer.

Maybe what you can do is look at S^2 in terms of individual sample averages. With variance we know Var[X] = E[X^2] - {E[X]}^2, so if you can show that each sample average converges in probability to its corresponding expectation (you already know what happens with the average of the X_i themselves), then the whole expression converges to what it's meant to.
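A sketch of that argument in symbols (this is the standard decomposition; it only needs the finite variance already assumed in the question):

```latex
S^2 = \frac{1}{n-1}\sum_{i=1}^n \left(X_i - \bar{X}_n\right)^2
    = \frac{n}{n-1}\left(\frac{1}{n}\sum_{i=1}^n X_i^2 - \bar{X}_n^2\right).
```

By the WLLN, $\displaystyle \frac{1}{n}\sum_{i=1}^n X_i^2 \rightarrow^p E[X^2] = \sigma^2 + \mu^2$; by the continuous mapping theorem, $\displaystyle \bar{X}_n^2 \rightarrow^p \mu^2$; and $\displaystyle \frac{n}{n-1} \rightarrow 1$. Combining these (Slutsky's theorem) gives $\displaystyle S^2 \rightarrow^p \sigma^2 + \mu^2 - \mu^2 = \sigma^2$.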

For this you will need to account for the bias correction in the estimator (the factor n/(n-1), which tends to 1), but the idea is still the same: expand out the sample variance and show that each piece approaches the corresponding parameter.
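You can also see the convergence numerically. A minimal sketch (function name and the choice of a normal population with sigma = 2 are just for illustration; the result holds for any finite-variance distribution):

```python
import random
import statistics

def sample_variance_demo(n, mu=0.0, sigma=2.0, seed=0):
    """Draw n i.i.d. N(mu, sigma^2) observations and return the
    unbiased sample variance S^2 (statistics.variance divides by n - 1)."""
    rng = random.Random(seed)
    xs = [rng.gauss(mu, sigma) for _ in range(n)]
    return statistics.variance(xs)

# S^2 should settle near sigma^2 = 4 as n grows
for n in (10, 1000, 100000):
    print(n, sample_variance_demo(n))
```

For large n the printed values cluster tightly around 4, which is exactly what convergence in probability predicts.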