Standardizing A Random Variable
To standardize a normally distributed random variable, it makes absolute sense to subtract the expected value, μ, from each value the random variable can assume: this shifts all of the values so that the expected value is centered at the origin. But what role does dividing by the standard deviation play in the standardization of a random variable? That part is not as intuitive to me as subtracting μ.
Re: Standardizing A Random Variable
I am not sure if I got your question right.
A standard normally distributed random variable has mean 0 and variance 1.
Let us consider a r.v. X with variance σ² and mean μ.
The r.v. X − μ now has expectation 0 (since the expectation is linear). In order to get a variance of 1 we therefore have to divide by the standard deviation σ: because variance scales with the square of a constant factor, Var((X − μ)/σ) = Var(X)/σ² = σ²/σ² = 1. So Z = (X − μ)/σ is standard normally distributed.
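If it helps to see it numerically, here is a small sketch (the values μ = 5, σ = 2 are just illustrative, not from the thread): draw samples from N(μ, σ²), apply z = (x − μ)/σ, and check that the standardized samples have mean ≈ 0 and standard deviation ≈ 1.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 5.0, 2.0  # hypothetical parameters for illustration

# Draw samples from N(mu, sigma^2)
x = rng.normal(loc=mu, scale=sigma, size=100_000)

# Standardize: subtract the mean, then divide by the standard deviation
z = (x - mu) / sigma

print(x.mean(), x.std())  # near 5.0 and 2.0
print(z.mean(), z.std())  # near 0.0 and 1.0
```

Subtracting μ alone would give samples centered at 0 but still with spread σ; the division is what rescales the spread to 1.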