I just got a strange result...
1. I have the sample standard deviation, s = 200.
2. I know the mean of the sample is at most 100 more than the mean of the population.
I must find the sample size at 95% confidence.
I thought that #2 meant the standard error of the mean was 100, but that leads to s/√n = 100, i.e. n = (200/100)² = 4.
Also, with this method I can't get any confidence interval.
Thank you for your answer, but I don't really get your notation.
I have the standard deviation of the sample, s = 200
The mean of the sample is at most 100 more than the mean of the population. I interpret that to mean the standard error of the mean is 100.
The equation for the standard error of the mean is s/√n, so 100 = 200/√n.
...but it seems that I can't extract n from there (well, I can with a little algebra, but the answer makes no sense). Why? Besides, even if I could get n from the equation, I would have no confidence interval.
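The "little algebra" mentioned above can be sketched numerically (a hypothetical check using the values from the post; the variable names are mine):

```python
# Values stated in the post.
s = 200.0   # sample standard deviation
se = 100.0  # standard error of the mean, under the interpretation above

# Solving s / sqrt(n) = se for n gives n = (s / se)^2.
n = (s / se) ** 2
print(n)  # 4.0
```

Note that no confidence level appears anywhere in this calculation, which is why the approach cannot produce a confidence interval.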
I don't understand why my reasoning doesn't work.
I have no idea how you got to your equation.
And even if I did, we don't have the information to solve it.
In short, the equation for the interval should be x̄ ± z·s/√n.
But, as I don't have the mean, I can't get the sample size.
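For comparison, here is a sketch of the standard margin-of-error approach, which uses only s, the desired half-width, and the 95% critical value, not the mean itself (the names E and z are my labels, not from the thread):

```python
import math

s = 200.0   # sample standard deviation
E = 100.0   # desired maximum distance between sample mean and population mean
z = 1.96    # two-sided 95% critical value of the standard normal

# Half-width of the interval: E = z * s / sqrt(n)  =>  n = (z * s / E)^2
n = (z * s / E) ** 2
n_required = math.ceil(n)  # round up to a whole number of observations
print(n, n_required)  # 15.3664 16
```

The mean only shifts the center of the interval; its width, and therefore the required n, depends on s, z, and E alone.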