Originally Posted by **MathGeek06**

A website posts movie ratings, which are computed by averaging the ratings posted by users on the website. Each user rates a particular movie on a scale of 1 to 10. Assume that the users' ratings are independently and identically distributed random variables, and that the variance of this distribution is known to be 1.5. Each movie is typically rated by about 100,000 users. The latest blockbuster comes out on Friday, and the website wants to post the movie's rating as soon as possible rather than wait until all of its users have posted their ratings.

a) What is the minimum number of ratings the website needs so that the average computed from this sample is within 0.2 of the true mean with 95% probability?

b) You are told that for this movie the mean of the individual ratings is 8.5. How many ratings should be sampled so that the sample average is above 8.1 with 80% probability?
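
A common way to attack both parts is the central limit theorem: treat the sample mean of n ratings as approximately normal with variance 1.5/n (a Chebyshev bound would also work but gives a more conservative n). The sketch below is not part of the quoted question; the normal approximation and the use of scipy are my assumptions, while the variance 1.5 and the thresholds 0.2, 8.5, 8.1 come from the post.

```python
# Minimal sketch: sample-size calculations under the CLT normal approximation.
import math
from scipy.stats import norm

var = 1.5  # known variance of a single user's rating (from the post)

# Part (a): smallest n with P(|Xbar - mu| <= 0.2) >= 0.95.
# Under the CLT, Xbar ~ N(mu, var/n), so we need
#   0.2 >= z_{0.975} * sqrt(var / n)   =>   n >= z_{0.975}^2 * var / 0.2^2
z_a = norm.ppf(0.975)                    # ~1.96
n_a = math.ceil(z_a**2 * var / 0.2**2)   # ceil(144.06) = 145
print("Part (a): n >=", n_a)

# Part (b): smallest n with P(Xbar > 8.1) >= 0.80 when mu = 8.5.
# P(Xbar > 8.1) = Phi((8.5 - 8.1) / sqrt(var / n)) >= 0.80
#   =>   0.4 * sqrt(n / var) >= z_{0.80}
z_b = norm.ppf(0.80)                     # ~0.8416
n_b = math.ceil((z_b / 0.4)**2 * var)    # ceil(6.64) = 7
print("Part (b): n >=", n_b)
```

The same arithmetic can of course be done by hand with a normal table; the code just makes the two inequalities explicit.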