- Nov 23rd 2009, 07:29 PM, chewitard: Standard Deviation question

I know that the standard deviation of X is the square root of the variance of X.

How do I find the standard deviation of (aX + b) if X has variance σ^2?

- Nov 23rd 2009, 10:34 PM, matheagle:
$\displaystyle V(aX+b)=a^2V(X)$. Do you want a proof?

So, $\displaystyle \sigma_{aX+b}=|a|\sigma_X$

since $\displaystyle \sqrt{a^2}=|a|$
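
Since a proof was offered but not posted, here is a short derivation of $V(aX+b)=a^2V(X)$ straight from the definition of variance and linearity of expectation:

$\displaystyle
\begin{aligned}
V(aX+b) &= E\!\left[\big((aX+b) - E[aX+b]\big)^2\right] \\
        &= E\!\left[\big(aX + b - aE[X] - b\big)^2\right] \quad \text{(linearity: } E[aX+b]=aE[X]+b\text{)} \\
        &= E\!\left[a^2\big(X - E[X]\big)^2\right] \\
        &= a^2\,E\!\left[\big(X - E[X]\big)^2\right] = a^2 V(X).
\end{aligned}
$

Taking square roots then gives $\sigma_{aX+b} = \sqrt{a^2 V(X)} = |a|\,\sigma_X$, matching the formula above. Note the shift $b$ drops out entirely: adding a constant moves the distribution but does not change its spread.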