If we have a random number A drawn from a normal distribution with mean M1 and variance V1, and a number B drawn independently from a normal distribution with mean M2 and variance V2, what is the probability that A > B?
A link to an online answer would do fine - I'm sure it's a standard problem, I just don't know how to search for it.
My attempted formula, where Phi is the standard normal CDF:

P(A > B) = 1 - Phi( -(M1 - M2) / sqrt(V1 + V2) )
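The reasoning I'm relying on (this assumes A and B are independent, so their variances add when taking the difference):

```latex
A - B \sim \mathcal{N}(M_1 - M_2,\, V_1 + V_2)
\;\Longrightarrow\;
P(A > B) = P(A - B > 0)
         = 1 - \Phi\!\left(\frac{-(M_1 - M_2)}{\sqrt{V_1 + V_2}}\right)
```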
As an example, if M1 = 10, M2 = 11, and V1 = V2 = 1, then my formula gives 0.24.
But when I run a simulation with a large number of samples from these two distributions, I get roughly 0.11.
Clearly I've made some mistake - which one is right?
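For reference, here's the kind of check I'd expect to settle this - a minimal sketch using NumPy and SciPy, where the sample count n and the seed are arbitrary choices of mine:

```python
import numpy as np
from scipy.stats import norm

M1, V1 = 10.0, 1.0  # mean and variance of A
M2, V2 = 11.0, 1.0  # mean and variance of B

# Closed form: A - B ~ Normal(M1 - M2, V1 + V2), so
# P(A > B) = P(A - B > 0) = 1 - Phi(-(M1 - M2) / sqrt(V1 + V2))
closed_form = 1.0 - norm.cdf(-(M1 - M2) / np.sqrt(V1 + V2))

# Monte Carlo estimate from independent samples.
# Note: rng.normal() takes the standard deviation, not the variance.
rng = np.random.default_rng(0)
n = 1_000_000
A = rng.normal(M1, np.sqrt(V1), size=n)
B = rng.normal(M2, np.sqrt(V2), size=n)
mc_estimate = np.mean(A > B)

print(f"closed form : {closed_form:.4f}")
print(f"monte carlo : {mc_estimate:.4f}")
```

If the Monte Carlo value here matches the closed form, then the mistake is somewhere in my original simulation rather than in the formula.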