Originally Posted by

**chengbin** Given a random variable $\displaystyle X$ with mean $\displaystyle \mu _X$, variance $\displaystyle \sigma ^2$, and mean of $\displaystyle X^2$ denoted $\displaystyle \mu _{X^2}$, prove that $\displaystyle \sigma ^2=\mu _{X^2} - \mu _X^2$.

Let $\displaystyle X$ take the values $\displaystyle x_1,x_2,...,x_n$ with $\displaystyle P(X = x_i) =p_i$ for $\displaystyle i=1,2,...,n$.

$\displaystyle \therefore \sigma ^2 = \sum_{i=1}^n (x_i-\mu _X)^2 \cdot p_i$

$\displaystyle = \sum_{i=1}^n (x_i^2 - 2\mu _X x_i + \mu _X^2) \cdot p_i$

$\displaystyle = \sum_{i=1}^n x_i^2 \cdot p_i - 2\mu _X \cdot \bigg (\sum_{i=1}^n x_i \cdot p_i \bigg ) + \mu _X^2 \bigg ( \sum_{i=1}^n p_i \bigg ) $ I'm pretty lost how to get this step. Mr F says: Just expand $\displaystyle (x_i - \mu_X)^2 = x_i^2 - 2\mu_X x_i + \mu_X^2$ and split the sum; constants like $\displaystyle \mu_X$ pull outside each sum.

and the rest. Mr F says: By definition, $\displaystyle {\color{red}\sum_{i=1}^n p_i = 1}$, $\displaystyle {\color{red}\sum_{i=1}^n x_i p_i = \mu_X}$ and $\displaystyle {\color{red}\sum_{i=1}^n x^2_i p_i = \mu_{X^2}}$. Substituting these gives:

$\displaystyle =\mu_{X^2}-2\mu _X \cdot \mu _X + \mu _X^2$

$\displaystyle =\mu _{X^2} - \mu _X^2$
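If it helps to see the identity in action, here is a quick numeric check in Python for a small made-up discrete distribution (the values and probabilities are just an illustration, not from the book): the variance computed straight from the definition matches the shortcut $\displaystyle \mu_{X^2} - \mu_X^2$.

```python
# Hypothetical discrete distribution: values x_i with probabilities p_i
xs = [1.0, 2.0, 5.0]
ps = [0.2, 0.5, 0.3]   # probabilities sum to 1

mu_X = sum(x * p for x, p in zip(xs, ps))        # E[X]
mu_X2 = sum(x * x * p for x, p in zip(xs, ps))   # E[X^2]

# Variance from the definition: sum of (x_i - mu_X)^2 * p_i
var_def = sum((x - mu_X) ** 2 * p for x, p in zip(xs, ps))

# Variance from the shortcut formula proved above
var_shortcut = mu_X2 - mu_X ** 2

print(var_def, var_shortcut)  # the two agree up to rounding
```

Both expressions evaluate to the same number, as the algebra above guarantees.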

I should have taken a picture of the book. That was a ton of LaTeX code to type, even though it doesn't look like much at all.