Standardizing A Random Variable

• Apr 17th 2013, 05:04 AM
Bashyboy
Standardizing A Random Variable
Hello,

To standardize a random variable that is normally distributed, it makes absolute sense to subtract the expected value, $\mu$, from each value that the random variable can assume--it shifts all of the values so that the distribution is centered at the origin. But what role does dividing by the standard deviation play in the standardization of a random variable? That part is not as intuitive to me as subtracting $\mu$.
• Apr 17th 2013, 08:45 AM
Juju
Re: Standardizing A Random Variable
I am not sure if I got your question right.

A standard normally distributed random variable has variance $1.$

Let us consider a r.v. $Y$ with variance $\sigma^2$ and mean $\mu$.

The r.v. $\frac{1}{\sigma}(Y-\mu)$ is then standard normally distributed. Subtracting $\mu$ gives mean $0$ (since expectation is linear), and since $\operatorname{Var}(aY) = a^2 \operatorname{Var}(Y)$, dividing by $\sigma$ rescales the variance from $\sigma^2$ to $1$. In order to get a variance of $1$, we therefore have to divide by the standard deviation.
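A quick numerical sketch of this (the sample values and the names `mu`, `sigma`, `y`, `z` are just illustrative, not from the thread):

```python
import random
import statistics

random.seed(0)

# Draw an illustrative sample from a normal distribution
# with mean 10 and standard deviation 2.
mu, sigma = 10.0, 2.0
y = [random.gauss(mu, sigma) for _ in range(100_000)]

# Standardize: subtract the mean, then divide by the standard deviation.
z = [(v - mu) / sigma for v in y]

# The standardized sample should have mean close to 0
# and standard deviation close to 1.
print(round(statistics.mean(z), 2))
print(round(statistics.stdev(z), 2))
```

Subtracting $\mu$ alone would leave the sample spread unchanged at about $2$; the division by $\sigma$ is what compresses (or stretches) the spread to $1$.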
• May 13th 2014, 12:16 PM
JacobE
Re: Standardizing A Random Variable
I didn't want to create another topic for this, so I'm just going to throw the question out here.
What's the actual unit of measurement that corresponds to a variable that's been standardized?