On a standard IQ test, the scores are normally distributed with a mean of 100 and a standard deviation of 20.
a. if you score 140, in what percentile are you?
b. if you score 90, in what percentile are you?
So I'm pretty sure I've got the idea behind doing questions like this, but I tend to get lost getting to the final answer. This is what I'm doing:
x bar = 100 and σ = 20
140 is 40 marks above x bar.
140 is 40/20 = 2σ above x bar.
Now, finding the area below the point 2σ above x bar is where I'm not sure what to do next. Help here would be much appreciated.
Since the mean is 100 and the SD is 20, note that a 140 IQ score puts you 2
standard deviations above the mean.
That means you scored higher than 97.5% of the people who took the IQ test.
That is, 97.5% of the area under the normal curve lies below your score. Remember the Empirical Rule:
68% of the data is within one SD of the mean, 95% of the data is within 2 SD,
and 99.7% of the data is within 3 SD. Therefore, 50 + 34 + 13.5 = 97.5.
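As a quick cross-check (not part of the original answer), here is a minimal Python sketch that recovers the Empirical Rule percentages from the exact normal CDF, using only the standard library's `math.erf`:

```python
from math import erf, sqrt

def normal_cdf(z):
    """Area under the standard normal curve to the left of z."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Empirical Rule percentages, recovered from the exact CDF:
for k in (1, 2, 3):
    within = normal_cdf(k) - normal_cdf(-k)
    print(f"within {k} SD: {within:.4f}")

# Percentile for a 140 score (z = 2):
print(normal_cdf(2))  # about 0.9772 -- the Empirical Rule rounds this to 97.5%
```

Note the exact area below z = 2 is about 97.7%, which the Empirical Rule's rounded figures (68/95/99.7) approximate as 97.5%.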
Suppose someone had a 180 IQ. That means they would be 4 SD above
the mean, which corresponds to an area of about .9999683.
That means they would score higher than 99.9968% of the others.
If 1,000,000 took the test, that means only about 32 out of 1 million
would score as high or higher.
For your 140 score, if 1 million took the test, about 25,000 out of the
million would score as high as or higher than 140.
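If it helps, those "out of a million" counts can be verified directly from the normal CDF; a small Python sketch (using the standard library's `math.erf`, not anything from the original post):

```python
from math import erf, sqrt

def normal_cdf(z):
    """Area under the standard normal curve to the left of z."""
    return 0.5 * (1 + erf(z / sqrt(2)))

million = 1_000_000
# Fraction scoring at or above 140 (z = 2) and 180 (z = 4), scaled to a million:
print(round((1 - normal_cdf(2)) * million))  # about 22,750
print(round((1 - normal_cdf(4)) * million))  # about 32
```

The exact CDF tightens the Empirical-Rule figure of 25,000 to roughly 22,750, while the 4-SD count comes out to about 32 per million, matching the answer above.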
See it a wee bit better now?
This makes a little more sense, but I still don't fully understand it. Is there some sort of formula, like x bar = ? + ?, that gives you the percentile? It's which numbers to use to get the percentile that messes me up. Also, looking at what you put down, I don't understand where the 50, 34, and 13.5 come from.
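For reference, the general recipe being asked about here can be sketched in Python (function name and defaults are illustrative, not from the thread): compute the z-score z = (x - x bar) / σ, then take the normal CDF of z as a percentage.

```python
from math import erf, sqrt

def percentile(x, mean=100, sd=20):
    """Percentile rank of score x on a test with the given mean and SD."""
    z = (x - mean) / sd                         # how many SDs above the mean x is
    return 100 * 0.5 * (1 + erf(z / sqrt(2)))   # normal CDF of z, as a percentage

print(percentile(140))  # about 97.7
print(percentile(90))   # about 30.9
```

So a 140 lands near the 98th percentile, and a 90 (half an SD below the mean) near the 31st.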