# Standard Deviation question

• November 23rd 2009, 07:29 PM
chewitard
Standard Deviation question
I know that the standard deviation of X is the square root of the variance of X.

How do I find the standard deviation of aX + b if X has variance σ^2?
• November 23rd 2009, 10:34 PM
matheagle
$V(aX+b)=a^2V(X)$. Do you want a proof?

So, $\sigma_{aX+b}=|a|\sigma_X$

since $\sqrt{a^2}=|a|$
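A quick sketch of the variance step (the standard expansion, writing $\mu = E(X)$):

```latex
\begin{aligned}
V(aX+b) &= E\big[(aX+b - E(aX+b))^2\big] \\
        &= E\big[(aX + b - a\mu - b)^2\big] && \text{since } E(aX+b) = a\mu + b \\
        &= E\big[a^2(X-\mu)^2\big] \\
        &= a^2\, E\big[(X-\mu)^2\big] = a^2 V(X).
\end{aligned}
```

Taking square roots, with $V(X)=\sigma^2$ as in the question, gives $\sigma_{aX+b} = \sqrt{a^2\sigma^2} = |a|\,\sigma$.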