Originally Posted by **emersong**

I am having a little problem understanding the difference (if there is one) between an event being 'rare' versus the probability of that event happening.

For example:

The English lottery draws 6 numbers from 1 to 49 (ignoring the bonus ball for now). If I take the average of every possible combination of 6 numbers, those averages form an approximately normal distribution, with a mean of 25 and a standard deviation of about 5.46 (5.4645).

I deduce from this that if I take a random sample (i.e. pick any past week's lottery result at random) and take the average of its 6 numbers, roughly 95% of the time the average should be between 15 and 35 (about two standard deviations either side of the mean).
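Those figures can be checked with a quick simulation; this is just a sketch using Python's standard library, sampling draws rather than enumerating all ~13.98 million combinations:

```python
import random
import statistics

# Estimate the distribution of the average of 6 numbers drawn
# without replacement from 1..49, by simulating many lottery draws.
random.seed(0)  # fixed seed so the run is repeatable
averages = [statistics.mean(random.sample(range(1, 50), 6))
            for _ in range(100_000)]

print(statistics.mean(averages))   # should be close to 25
print(statistics.stdev(averages))  # should be close to 5.46
```

With 100,000 simulated draws the estimated mean and standard deviation should land very close to the exact values quoted above.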

From the above, the set {1,2,3,4,5,6}, whose average is 3.5, would be a very rare occurrence. Yet I know that it has the same probability of being drawn as any other 6 numbers.
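Both statements can sit side by side: every specific 6-number set has identical probability, while averages far from 25 are rare because very few sets produce them. A rough check, again a standard-library sketch:

```python
import random
import statistics
from math import comb

# Every specific set of 6 numbers has the same probability of being drawn:
p_any_set = 1 / comb(49, 6)   # 1 in 13,983,816 -- same for {1,2,3,4,5,6}

# Yet almost no sets have an average as low as 3.5 (only {1,2,3,4,5,6}
# does), so that *average* is rare even though the set itself is no less
# likely than any other.
random.seed(1)
draws = [statistics.mean(random.sample(range(1, 50), 6))
         for _ in range(200_000)]
share_extreme = sum(a <= 3.5 for a in draws) / len(draws)      # essentially 0
share_middle = sum(15 <= a <= 35 for a in draws) / len(draws)  # most draws
```

The rarity attaches to the *event* "the average is at most 3.5" (one set out of 13,983,816), not to the particular set itself.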

So my question is:

If I have a process at work, for example, that is normally distributed with a mean of 50 and a standard deviation of 0.5, and I were to take one random sample from that process, I am guessing it would be a very rare occurrence for that sample to measure 51.5 (three standard deviations above the mean).

Assuming that the various factors that control the output of the process have an equal chance of occurring, would that one sample measuring 51.5 have the same probability of being picked out as any other sample, even though I know that roughly 95% of the time a sample should fall between 49 and 51?
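The process numbers can be made concrete with the normal model: 51.5 is three standard deviations above the mean, so seeing a value that high is rare even though that particular unit was no harder to pick out than any other. A sketch using Python's `statistics.NormalDist`:

```python
from statistics import NormalDist

# Model the process output as N(mean=50, sd=0.5), as described above.
process = NormalDist(mu=50, sigma=0.5)

# 51.5 is (51.5 - 50) / 0.5 = 3 standard deviations above the mean.
p_at_least_51_5 = 1 - process.cdf(51.5)              # about 0.00135 (0.135%)

# And the "95% between 49 and 51" figure (mean +/- 2 sd):
p_between_49_51 = process.cdf(51) - process.cdf(49)  # about 0.954
```

So a reading of 51.5 or more happens on roughly 1 draw in 740, which matches the intuition that it is rare without making any individual unit more or less likely to be sampled.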

P.S.

Just out of curiosity: if everyone who played the lottery picked numbers whose average was between 15 and 35, would there be a lot more winners?!