- November 23rd 2009, 07:29 PM, chewitard: Standard Deviation question
I know that the standard deviation of X is sqrt of the variance of X.

How do I find the standard deviation of (aX + b) if X has a variance of σ^2?
- November 23rd 2009, 10:34 PM, matheagle
Do you want a proof?

Since the constant b only shifts X and doesn't change its spread, Var(aX + b) = Var(aX) = a^2 Var(X) = a^2 σ^2.

So, taking the square root,

SD(aX + b) = sqrt(a^2 σ^2) = |a| σ.

Note the absolute value: a standard deviation can't be negative, so you take |a|, not a.
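If you want to see this numerically, here is a quick sanity check (my own sketch, not from the thread): simulate a sample of X with a known σ, transform it with arbitrary a and b, and compare the two standard deviations. The values a = -3, b = 5, σ = 2 are just illustrative choices.

```python
import random
import statistics

random.seed(0)

# Sample X ~ Normal(0, sigma) with sigma = 2 (arbitrary choice).
sigma = 2.0
x = [random.gauss(0, sigma) for _ in range(100_000)]

# Apply the linear transform Y = aX + b (a negative, to show |a| matters).
a, b = -3.0, 5.0
y = [a * xi + b for xi in x]

sd_x = statistics.pstdev(x)
sd_y = statistics.pstdev(y)

# sd(aX + b) should equal |a| * sd(X) exactly (up to floating-point error),
# regardless of b.
print(sd_y, abs(a) * sd_x)
```

The two printed numbers agree to machine precision: shifting by b does nothing to the spread, and scaling by a multiplies the standard deviation by |a|.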