Originally Posted by **mickreiss**

I have a library function cumul(x) which gives the cumulative area of a normal distribution with zero mean and unit variance, where the argument is the number of standard deviations away from the centre; i.e. cumul(-infinity) = 0, cumul(0) = 0.5, and cumul(infinity) = 1. By my calculations your suggestion means that the probability of A being greater than B is:

1 - cumul( -1 * (M1-M2)/sqrt(V1+V2) )

As an example, if M1 = 10, M2 = 11, and V1 = V2 = 1, then my formula says the answer is 0.24.

But doing a simulation with a large number of samples from these distributions, I get an answer of 0.11.

Clearly I've made some mistake - which one is right?

Your MC is wrong - your formula is right.
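To see why 0.24 is the right value, here is a minimal sketch in Python that evaluates your formula directly, assuming cumul is the standard normal CDF (implemented here with math.erf, since Python has no built-in cumul):

```python
from math import erf, sqrt

def cumul(x):
    # Standard normal CDF written via the error function; this plays the
    # role of the library cumul() described above.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Parameters from the example in the thread.
M1, M2, V1, V2 = 10.0, 11.0, 1.0, 1.0

# P(A > B) = P(A - B > 0), and A - B ~ N(M1 - M2, V1 + V2).
p = 1.0 - cumul(-(M1 - M2) / sqrt(V1 + V2))
print(round(p, 4))  # → 0.2398
```

So the closed form gives about 0.24, in agreement with the simulation below.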

Code:

>N=10000;
>A=normal(1,N)*1+10; B=normal(1,N)*1+11;
>sum(A>B)/N
0.2407
CB