# Thread: The logic of standard error

1. ## The logic of standard error

I hope someone knows how the logic of this equation works.

In finding the standard error in a single sample t-test, why is n square-rooted? What does square rooting this number accomplish as a concept?

I don't have a problem using the equation, but I feel like a robot just doing it and not knowing why.

I want to understand.

2. ## Re: The logic of standard error

Ok, I don't know the exact answer to this (Google might), but consider the relationship between the variance $\displaystyle s^2$ and the standard deviation $\displaystyle s$: the same relationship holds between

$\displaystyle \frac{s^2}{n}$ and $\displaystyle \frac{s}{\sqrt{n}}.$

That is, $\displaystyle \frac{s^2}{n}$ is the (estimated) variance of the sample mean, and square-rooting it to get back to the standard-deviation scale gives the standard error $\displaystyle \frac{s}{\sqrt{n}}$.
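To make that relationship concrete, here is a minimal sketch (Python, not part of the original post; the sample statistics are made-up numbers for illustration) showing that the standard error $s/\sqrt{n}$ is just the square root of the variance of the sample mean, $s^2/n$:

```python
import math

# Hypothetical sample statistics, chosen only for illustration
s = 2.5   # sample standard deviation
n = 16    # sample size

var_of_mean = s**2 / n        # estimated variance of the sample mean
se = s / math.sqrt(n)         # standard error of the mean

# The standard error is exactly the square root of var_of_mean
print(var_of_mean)            # variance scale
print(se)                     # standard-deviation scale
print(math.sqrt(var_of_mean)) # equals se
```

So square-rooting $n$ in the denominator is the same move as square-rooting the variance to recover a standard deviation.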

3. ## Re: The logic of standard error

I suppose at the deepest level, we use $\displaystyle \sqrt n$ so that everything is scaled properly. If you divided by something larger, like $\displaystyle n$, then as $\displaystyle n \to \infty$ our test statistic would go to $\displaystyle 0$, regardless of whether the null hypothesis is true or not, instead of having a T-distribution.

Why does $\displaystyle \sqrt n$ accomplish this? Well, because that is the rate at which the standard deviation of a sum increases. For example, if $\displaystyle X_1, ..., X_n$ are independent draws from a normal distribution with mean $\displaystyle \mu$ and variance $\displaystyle \sigma^2$, then $\displaystyle \mbox{Var}[\sum X_i] = n\sigma^2$, so the standard deviation of the sum is $\displaystyle \sqrt n \sigma$. So the standard deviation of a sum increases at a rate of $\displaystyle \sqrt n$ - or equivalently, the standard deviation of $\displaystyle \bar X$ is $\displaystyle \sigma / \sqrt n$, decreasing at a rate of $\displaystyle \sqrt n$. Scaling by $\displaystyle \sqrt n$ makes it so that the standard deviation of our test statistic goes neither to 0 nor to $\displaystyle \infty$, but stays constant.
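Guy's claim is easy to check by simulation. The following sketch (Python with NumPy; the mean, variance, sample sizes, and seed are all arbitrary choices, not from the thread) estimates the standard deviation of $\bar X$ for several $n$ and compares it to $\sigma/\sqrt n$:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 10.0, 3.0
reps = 20000  # number of simulated samples per sample size

for n in (4, 16, 64, 256):
    # Draw `reps` samples of size n and take each sample's mean
    means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    # Empirical sd of the sample mean vs. the theoretical sigma / sqrt(n)
    print(n, means.std(), sigma / np.sqrt(n))
```

Each time $n$ quadruples, the empirical standard deviation of $\bar X$ roughly halves, matching the $\sigma/\sqrt n$ rate.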

4. ## Re: The logic of standard error

> **Originally Posted by Guy**
> Why does $\displaystyle \sqrt n$ accomplish this? Well, because that is the rate at which the standard deviation of a sum increases. For example, if $\displaystyle X_1, ..., X_n$ are independent draws from a normal distribution with mean $\displaystyle \mu$ and variance $\displaystyle \sigma^2$, then $\displaystyle \mbox{Var}[\sum X_i] = n\sigma^2$, so the standard deviation of the sum is $\displaystyle \sqrt n \sigma$. So the standard deviation of a sum increases at a rate of $\displaystyle \sqrt n$ - or equivalently, the standard deviation of $\displaystyle \bar X$ is $\displaystyle \sigma / \sqrt n$, decreasing at a rate of $\displaystyle \sqrt n$. Scaling by $\displaystyle \sqrt n$ makes it so that the standard deviation of our test statistic goes neither to 0 nor to $\displaystyle \infty$, but stays constant.
Reading this was like drinking fresh, cold water in a desert. I can't tell you how refreshing it is to understand that concept; our professor's interests gravitate more toward making sure we all know how to interpret the final answer in the big picture, rather than understand the mechanics that make it the correct answer to begin with.

Thanks muchly!