I know that the standard deviation of X is sqrt of the variance of X. How do I find the standard deviation of (aX + b) if X has a variance of σ^2?
Do you want a proof? Since Var(X) = E[(X − μ)^2] where μ = E[X], and E[aX + b] = aμ + b, we have

Var(aX + b) = E[(aX + b − (aμ + b))^2] = E[a^2 (X − μ)^2] = a^2 Var(X) = a^2 σ^2.

Taking the square root gives SD(aX + b) = sqrt(a^2 σ^2) = |a| σ. Note the absolute value: a standard deviation can't be negative. Also notice that b drops out entirely; adding a constant just shifts the distribution without changing its spread.
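If you want to sanity-check the result numerically, here is a quick simulation (the choices of a = −3, b = 5, and a normal X with σ = 4 are just illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = -3.0, 5.0
sigma = 4.0

# X ~ Normal(mean=2, sd=4), so Var(X) = sigma^2 = 16
x = rng.normal(2.0, sigma, size=1_000_000)

# Sample standard deviation of aX + b; theory says it should be |a|*sigma = 12
sd_y = np.std(a * x + b)
print(sd_y, abs(a) * sigma)  # both close to 12
```

The negative a is deliberate: it shows why the answer is |a|σ rather than aσ.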