Given a random variable X with distribution N(mu, 225), how can I find the sample size n required for a 95% confidence interval for the mean mu with a margin of error of at most epsilon = 0.01?
The only formula I can find is n = (z^2 * s^2) / 0.25. Can I use the known population variance instead (take the square root of 225 to get sigma = 15) and substitute it into the equation? I hope someone can help me; my search through websites has only made it more confusing. Thank you!
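In case it helps, here is how I am trying to set it up numerically. I am assuming the known-sigma formula n >= (z_{alpha/2} * sigma / epsilon)^2 is the right one, and `sample_size` is just my own helper name:

```python
import math
from statistics import NormalDist

def sample_size(sigma, epsilon, confidence=0.95):
    """Smallest n so a two-sided z-interval for mu has
    half-width at most epsilon, assuming sigma is known:
    n >= (z * sigma / epsilon)**2, rounded up."""
    # z_{alpha/2}: about 1.96 for a 95% confidence level
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    return math.ceil((z * sigma / epsilon) ** 2)

# Var(X) = 225, so sigma = sqrt(225) = 15; target epsilon = 0.01
print(sample_size(sigma=15, epsilon=0.01))
```

With epsilon this small the required n comes out in the millions, which is what made me doubt I had the right formula in the first place.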