I know that the standard deviation of X is the square root of the variance of X. How do I find the standard deviation of aX + b if X has variance σ²?
$\displaystyle V(aX+b)=a^2V(X)$. For a quick proof, let $\displaystyle \mu=E(X)$, so $\displaystyle E(aX+b)=a\mu+b$ and $\displaystyle V(aX+b)=E\big[(aX+b-(a\mu+b))^2\big]=E\big[a^2(X-\mu)^2\big]=a^2V(X)$. Taking square roots, $\displaystyle \sigma_{aX+b}=|a|\sigma_X$ since $\displaystyle \sqrt{a^2}=|a|$.
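You can also sanity-check the identity numerically. A minimal sketch with NumPy (the normal distribution and the constants a, b, σ are arbitrary choices for illustration):

```python
import numpy as np

# Check that sigma_{aX+b} = |a| * sigma_X on a large sample.
rng = np.random.default_rng(0)
sigma = 2.0          # sigma_X for the simulated X
a, b = -3.0, 5.0     # arbitrary affine transform

x = rng.normal(loc=1.0, scale=sigma, size=1_000_000)
y = a * x + b

# The sample standard deviation of y should be close to |a| * sigma = 6.0,
# and it equals |a| * std(x) up to floating-point rounding, since adding b
# shifts the mean without changing the spread.
print(np.std(y))
print(abs(a) * np.std(x))
```

Note the sign of a does not matter: flipping the sign of every deviation leaves the spread unchanged, which is exactly why the absolute value appears.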