# Confidence intervals and assumption of normality

• May 19th 2011, 06:48 AM
TheRobster
Confidence intervals and assumption of normality
Hello,

I am analysing some data and calculating the 95% confidence intervals with the 'usual' approach of Xbar +/- t * s/sqrt(n).
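Just to make the formula concrete, here is a quick numeric sketch of that calculation (the data values are made up for illustration, and the t value is looked up from a table for df = 7):

```python
import numpy as np

# Hypothetical sample data, purely for illustration
data = np.array([2.1, 3.4, 1.8, 5.2, 2.9, 4.1, 3.3, 2.5])
n = len(data)
xbar = data.mean()
s = data.std(ddof=1)   # sample standard deviation (n - 1 denominator)
t_crit = 2.365         # 97.5th percentile of t with df = n - 1 = 7, from a t table
half_width = t_crit * s / np.sqrt(n)

print(f"95% CI: ({xbar - half_width:.2f}, {xbar + half_width:.2f})")
```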

However, I have read several times that one of the assumptions behind this method is that the distribution has to be normal (or close to normal). My original data is far from normal and is noticeably skewed right. HOWEVER, my understanding is that, provided my sample size is > 30 (in most cases it is), the sampling distribution of the mean will be approximately normal. Since it is the sampling distribution I am using to calculate the confidence interval, the original sample distribution does not have to be normal.

Does anyone know if this is correct? I.e. that as long as n > 30 it does not matter what shape the original distribution is, since my sampling distribution will be normal (or close to normal), so I can use the above method to calculate the 95% CIs?

Cheers!
Rob
• May 19th 2011, 02:07 PM
pickslides
Hi TheRobster

Your data does not have to be normal to apply this confidence interval. Why?

Well, the confidence interval you are looking at is for the mean, not for the data itself. While the data are not normal, if you took many samples of this size (n > 30) from the population, the distribution of the resulting sample means would be approximately normal.

Refer to the 'Central Limit Theorem' to find out more.
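You can also check this empirically. The sketch below (not from the thread; it assumes NumPy is available) draws repeated samples from a strongly right-skewed Exponential population and counts how often the usual t-interval covers the true mean. With n well above 30, the empirical coverage lands close to the nominal 95%, even though the population is skewed:

```python
import numpy as np

rng = np.random.default_rng(0)

true_mean = 1.0   # mean of an Exponential(1) population (strongly right-skewed)
n = 50            # sample size, comfortably above the n > 30 rule of thumb
trials = 5000     # number of repeated samples
t_crit = 2.0096   # 97.5th percentile of t with df = n - 1 = 49, from a t table

covered = 0
for _ in range(trials):
    sample = rng.exponential(scale=true_mean, size=n)
    xbar = sample.mean()
    s = sample.std(ddof=1)
    half_width = t_crit * s / np.sqrt(n)
    # Does the interval Xbar +/- t * s/sqrt(n) contain the true mean?
    if xbar - half_width <= true_mean <= xbar + half_width:
        covered += 1

print(f"Empirical coverage: {covered / trials:.3f}")
```

For a distribution this skewed the coverage tends to come out a little below 0.95 at n = 50; increasing n pushes it closer to the nominal level.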
• May 20th 2011, 02:32 AM
TheRobster
Thanks. :)