Standardizing A Random Variable

Hello,

To standardize a normally distributed random variable, it makes sense to subtract the expected value, $\displaystyle \mu$, from each value the random variable can assume: this shifts the distribution so that its expected value sits at the origin. But what role does dividing by the standard deviation play in the standardization? That part is not as intuitive to me as subtracting $\displaystyle \mu$.

Re: Standardizing A Random Variable

I am not sure if I got your question right.

A standard normally distributed random variable has variance $\displaystyle 1.$

Let us consider a r.v. $\displaystyle Y$ with variance $\displaystyle \sigma^2$ and mean $\displaystyle \mu$.

The r.v. $\displaystyle \frac{1}{\sigma}(Y-\mu)$ is now standard normally distributed. Subtracting $\displaystyle \mu$ makes the mean $\displaystyle 0$ (since expectation is linear), but it does not change the variance. Because $\displaystyle \operatorname{Var}(aY) = a^2 \operatorname{Var}(Y)$, we have $\displaystyle \operatorname{Var}\!\left(\frac{Y-\mu}{\sigma}\right) = \frac{\sigma^2}{\sigma^2} = 1$. In order to get a variance of $\displaystyle 1$ we therefore have to divide by the standard deviation.
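A quick numerical sketch of this (using NumPy, with arbitrary example values $\displaystyle \mu = 5$, $\displaystyle \sigma = 2$): subtracting the mean alone centers the samples but leaves the spread untouched, and only the division by $\displaystyle \sigma$ brings the standard deviation to $\displaystyle 1$.

```python
import numpy as np

# Draw many samples from a normal distribution with mu = 5, sigma = 2
# (values chosen just for illustration).
rng = np.random.default_rng(0)
mu, sigma = 5.0, 2.0
y = rng.normal(mu, sigma, size=1_000_000)

centered = y - mu           # mean ~ 0, but std is still ~ 2
z = (y - mu) / sigma        # standardized: mean ~ 0 AND std ~ 1

print(round(centered.mean(), 2), round(centered.std(), 2))
print(round(z.mean(), 2), round(z.std(), 2))
```

The printed values should be close to (0, 2) for the centered samples and (0, 1) for the fully standardized ones, up to sampling noise.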

Re: Standardizing A Random Variable

I didn't want to create another topic for this, so I'm just going to throw the question out here.

What's the actual unit of measurement that corresponds to a variable that's been standardized?