# sample size

• May 8th 2008, 01:07 AM
AnnM
sample size
I just got a strange result...

1. I have the standard deviation = 200
2. I know the mean of the sample is at most 100 more than the mean of the population.

I must find the size of the sample with 95% confidence.

I thought that #2 meant the standard error for the mean was 100;

$\displaystyle s_{\bar{x}} = \frac{s}{\sqrt{n}}$

$\displaystyle 100 = \frac{200}{\sqrt{n}}$

$\displaystyle n = 4$

Also, with this method I can't get any confidence interval.
• May 8th 2008, 11:15 AM
CaptainBlack
Quote:

Originally Posted by AnnM
I just got a strange result...

1. I have the standard deviation = 200
2. I know the mean of the sample is at most 100 more than the mean of the population.

I must find the size of the sample with 95% confidence.

I thought that #2 meant the standard error for the mean was 100;

$\displaystyle s_{\bar{x}} = \frac{s}{\sqrt{n}}$

$\displaystyle 100 = \frac{200}{\sqrt{n}}$

$\displaystyle n = 4$

Also, with this method I can't get any confidence interval.

Suppose the sample size is $\displaystyle n$, then assume large-sample statistics, so:

$\displaystyle m\sim N(\mu,\sigma^2/n)$

Now our confidence interval $\displaystyle [n_1,n_2]$ (here a range of acceptable sample sizes) is chosen so that for every $\displaystyle n \in [n_1,n_2]$

$\displaystyle p(m>\mu+100|n)<0.05$

That is for any $\displaystyle n \in [n_1,n_2]$ the probability of getting a result worse than that actually observed is less than $\displaystyle 5\%$.
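A quick numerical check of this condition (an editor's sketch, not part of the original thread): with $\sigma = 200$ and $m\sim N(\mu,\sigma^2/n)$, the tail probability $p(m>\mu+100\,|\,n)$ equals $P(Z > \sqrt{n}/2)$ for a standard normal $Z$, and it falls below $0.05$ as $n$ grows.

```python
# Sketch: tail probability P(m > mu + 100 | n) under m ~ N(mu, sigma^2 / n),
# using only the standard library (the survival function of a standard
# normal is 0.5 * erfc(z / sqrt(2))).
import math

def tail(n, sigma=200.0, d=100.0):
    """One-sided tail probability P(m > mu + d) for sample size n."""
    z = d / (sigma / math.sqrt(n))            # standardized threshold: sqrt(n)/2 here
    return 0.5 * math.erfc(z / math.sqrt(2))  # standard normal survival function

for n in (4, 11, 16):
    print(n, round(tail(n), 4))
# n = 4 gives about 0.159, n = 11 about 0.049, n = 16 about 0.023
```

Note how $n = 4$ (the value from the first post) leaves a tail of roughly 16%, nowhere near the required 5%.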

RonL
• May 8th 2008, 10:20 PM
AnnM

I have the standard deviation of the sample, s = 200

The mean of the sample is at most 100 more than the mean of the population. I interpret that as meaning the standard error of the mean is 100.

The equation for the standard error of the mean is

$\displaystyle s_{\bar{x}} = \frac{s}{\sqrt{n}}$

...but it seems that I can't extract n from there (well, I can with a little algebra, but the answer makes no sense). Why? Besides, even if I could get n from the equation, I would have no confidence interval.
• May 8th 2008, 10:43 PM
CaptainBlack
Quote:

Originally Posted by AnnM

I have the standard deviation of the sample, s = 200

The mean of the sample is at most 100 more than the mean of the population. I interpret that as meaning the standard error of the mean is 100.

The equation for the standard error of the mean is

$\displaystyle s_{\bar{x}} = \frac{s}{\sqrt{n}}$

...but it seems that I can't extract n from there (well, I can with a little algebra, but the answer makes no sense). Why? Besides, even if I could get n from the equation, I would have no confidence interval.

There was a typo in my earlier post:

$\displaystyle p(m>\sigma+100|n)<0.05$

should be:

$\displaystyle p(m>\mu+100|n)<0.05$

Which gives us a nicer condition:

$\displaystyle p(m-\mu >100|n)<0.05$

$\displaystyle p\left(\frac{m-\mu}{\sigma/\sqrt{n}} >\frac{\sqrt{n}}{2}\right)<0.05$
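Solving this for $n$ (an editor's sketch, not from the thread): the condition reduces to requiring that $\sqrt{n}/2$ exceed the standard normal critical value $z$, i.e. $n \ge (2z)^2$, since $\sigma = 200$ and the threshold is 100.

```python
# Sketch: smallest n satisfying 100 / (sigma / sqrt(n)) >= z,
# i.e. n >= (z * sigma / 100)^2, with the usual normal critical values.
import math

def min_n(z, sigma=200.0, d=100.0):
    """Smallest integer n with d / (sigma / sqrt(n)) >= z."""
    return math.ceil((z * sigma / d) ** 2)

print(min_n(1.6449))  # one-sided 95%: n = 11
print(min_n(1.96))    # two-sided 95%: n = 16
```

Whether the answer is 11 or 16 depends on reading "at most 100 more" as a one-sided or a two-sided condition; the thread's later posts use the two-sided value 1.96.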

RonL
• May 9th 2008, 07:57 AM
AnnM
I don't understand why my reasoning doesn't work.

I have no idea how you got to your equation.

And even if I did, we don't have the information to solve it.

In short, the equation for the interval should be;

$\displaystyle \left[\bar{x} - 1.96\frac{200}{\sqrt{n}};\bar{x} + 1.96\frac{200}{\sqrt{n}}\right]$

But, as I don't have the mean, I can't get the sample size.
• May 9th 2008, 10:39 AM
AnnM
In fact, I got;

$\displaystyle n = \left[\frac{Z_{\alpha/2}\,s}{s_{\bar{x}}}\right]^2 = \left[\frac{1.96(200)}{100}\right]^2 = 15.37$

It doesn't seem to make sense either...
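For what it's worth (an editor's check, not AnnM's), 15.37 does make sense once rounded up: $n$ must be a whole number, and $n = 16$ is the smallest value whose 95% margin of error $1.96 \cdot 200/\sqrt{n}$ stays within 100.

```python
# Sketch: round the raw sample size up and confirm the resulting
# 95% margin of error is at most the required 100.
import math

s, z, target = 200.0, 1.96, 100.0
n_raw = (z * s / target) ** 2      # 15.3664
n = math.ceil(n_raw)               # 16, since n must be a whole number
margin = z * s / math.sqrt(n)      # 98.0, just inside the 100 requirement
print(n, margin)
```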
• May 9th 2008, 01:08 PM
TKHunny
d = 100

Why doesn't n = 16 make sense?

If only we had the entire problem statement...
• May 12th 2008, 07:22 AM
AnnM
Quote:

Originally Posted by TKHunny
Why doesn't n = 16 make sense?

Because;

$\displaystyle s_{\bar{x}} = \frac{s}{\sqrt{n}}$