Originally Posted by essedra:
A radar tends to overestimate the distance of an aircraft, and the error is a normal random variable with a mean of 89 meters and a standard deviation of 190 meters.
What is the probability that the measured distance will be smaller than the true distance?
I interpreted the question like this: what is the probability that the "overestimate" (the error) is less than zero? (Do you think I interpreted it right?)
Then I expressed the difference between 0 and the mean (89) as a number of standard deviations:

z = (0 - 89)/190 ≈ -0.4684

but now I'm stuck... Can anyone help me with the rest of the question?
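In case a numerical check is useful, here is a minimal Python sketch of the calculation, assuming the error X ~ N(89, 190²) exactly as stated in the quote; the use of scipy.stats.norm is my choice for illustration, not part of the original problem.

```python
from scipy.stats import norm

# Measurement error X ~ N(mean = 89 m, sd = 190 m), per the quoted problem.
# The measured distance is smaller than the true distance exactly when
# the error is negative, so we want P(X < 0).
mean, sd = 89.0, 190.0

# Standardize: z = (0 - mean) / sd
z = (0 - mean) / sd          # ≈ -0.4684

# P(X < 0) = Phi(z), the standard normal CDF evaluated at z
p = norm.cdf(z)              # equivalently: norm.cdf(0, loc=mean, scale=sd)

print(f"z = {z:.4f}, P(X < 0) = {p:.4f}")  # z ≈ -0.4684, P ≈ 0.3198
```

With a z-table instead of code, the same step is P(Z < -0.4684) = 1 - Φ(0.4684) by the symmetry of the standard normal distribution.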
I'd appreciate any responses.