I am a programmer charged with finding the standard deviation of a number. It has been many years since I did any statistics work. I could use some help. Google has me lost...
We have an online questionnaire taken by students. For each question, we have two mean scores. The first is the mean (average) score of all students taking the questionnaire. The second is the mean (average) score of a particular sub-set of these same students.
Example numbers for the first question: I get a mean of 5.09 for my full group and 3.83 for my sub-set group. How do I find the standard deviation of the 3.83 as it relates to the 5.09?
If possible, can anyone tell me the steps as I would have to do them on a simple calculator? In code, I cannot do math using symbols like sigma or the mean symbol. Here is an example of how I tried this:
1) find the mean. I have 5.09. I figured this is the mean I should use.
2) subtract the mean from the other number: 3.83 - 5.09 = -1.26
3) square the result = 1.5876
4) add the results (only one number) so: 1.5876
5) divide by n (1) = 1.5876
6) take the square root = 1.26.
For some reason that doesn't seem right. Thoughts?
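Here is the same attempt expressed in code, in case that is clearer (I'm using Python just as an illustration; the scores list is made up, since in the real case I'd presumably need each student's raw score rather than just the two means):

```python
# Population standard deviation, following the six steps above.
def std_dev(scores):
    n = len(scores)
    mean = sum(scores) / n                             # step 1: find the mean
    squared_diffs = [(x - mean) ** 2 for x in scores]  # steps 2-3: subtract and square
    variance = sum(squared_diffs) / n                  # steps 4-5: sum and divide by n
    return variance ** 0.5                             # step 6: take the square root

# What I actually have as input -- the two means. Is this even the right input?
scores = [5.09, 3.83]
print(std_dev(scores))
```

Feeding it the two means gives about 0.63, which is just half the distance between them, so I suspect I should be running this over the individual student scores instead.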
Thanks for any insight.