Originally Posted by **Guy**

Why does $\displaystyle \sqrt n$ accomplish this? Well, because that is the rate at which the standard deviation of a sum grows. For example, if $\displaystyle X_1, ..., X_n$ are independent draws from a normal distribution with mean $\displaystyle \mu$ and variance $\displaystyle \sigma^2$, then $\displaystyle \mbox{Var}[\sum X_i] = n\sigma^2$, so the standard deviation of the sum is $\displaystyle \sqrt n \sigma$. So the standard deviation of a sum increases at rate $\displaystyle \sqrt n$ - or equivalently, the standard deviation of $\displaystyle \bar X$ is $\displaystyle \sigma/\sqrt n$, which decreases at rate $\displaystyle 1/\sqrt n$. Scaling by $\displaystyle \sqrt n$ makes it so that the standard deviation of our test statistic $\displaystyle \sqrt n(\bar X - \mu)$ goes neither to 0 nor to $\displaystyle \infty$, but stays constant at $\displaystyle \sigma$.
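A quick simulation makes this concrete. The sketch below (my own illustration, not from the original post; the specific values of $\mu$, $\sigma$, and the sample sizes are arbitrary) checks that the standard deviation of the sum grows like $\sqrt n \sigma$ while the standard deviation of $\sqrt n(\bar X - \mu)$ stays near $\sigma$ for every $n$:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 5.0, 2.0   # arbitrary choices for illustration
reps = 20000           # Monte Carlo replications per sample size

for n in (10, 100, 1000):
    # reps independent samples of size n from N(mu, sigma^2)
    x = rng.normal(mu, sigma, size=(reps, n))
    sums = x.sum(axis=1)
    stat = np.sqrt(n) * (x.mean(axis=1) - mu)
    # sd of the sum grows like sqrt(n)*sigma;
    # sd of the scaled statistic stays near sigma
    print(n, sums.std(), np.sqrt(n) * sigma, stat.std())
```

For each $n$, the second and third printed numbers nearly agree, and the last column hovers around 2.0 (our $\sigma$) regardless of $n$.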