Hello! I am having trouble understanding this statement:

If one, in an attempt to factorize a composite number $n$, finds two numbers $a$ and $b$ at random such that $a^2 - b^2 \equiv 0 \pmod{n}$, then taking the greatest common divisor of $a \pm b$ with $n$ yields a nontrivial factor of $n$ with probability $\frac{1}{2}$.
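
To make the setup concrete, here is a small Python sketch I wrote (the modulus $n = 77$ and the restriction to $a, b$ coprime to $n$ are my own choices for the experiment): it enumerates every pair with $a^2 \equiv b^2 \pmod{n}$ and counts how often $\gcd(a - b, n)$ is nontrivial.

```python
from math import gcd

# A small odd semiprime, chosen only for illustration; any n = p*q works.
n = 7 * 11  # 77

trivial = 0
nontrivial = 0

# Enumerate every pair (a, b) coprime to n with a^2 ≡ b^2 (mod n),
# i.e. every way the hypothesis of the statement can occur mod n.
for a in range(1, n):
    if gcd(a, n) != 1:
        continue
    for b in range(1, n):
        if gcd(b, n) != 1:
            continue
        if (a * a - b * b) % n == 0:
            # gcd(a + b, n) would give the complementary factor.
            d = gcd(a - b, n)
            if 1 < d < n:
                nontrivial += 1
            else:
                trivial += 1

total = trivial + nontrivial
print(f"nontrivial: {nontrivial}/{total} = {nontrivial / total}")
```

For $n = 77$ each square has exactly four roots $\pm a, \pm c$, and the count comes out to exactly one half, which is the $\frac{1}{2}$ I am trying to understand.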
I don't understand why there is a $0.5$ probability of finding a nontrivial factor of $n$. If we call $p$ one prime factor of $n$, then finding a nontrivial factor means $p$ must divide $a \pm b$, and there are only $\frac{n}{p}$ multiples of $p$ up to $n$ for which this can happen. On the other hand, finding a trivial factor means $n \mid a \pm b$, and there are infinitely many numbers $a$ and $b$ satisfying this condition. So it seems it should be very likely to find a trivial factor of $n$ with a difference of squares, which is not true.

Can somebody explain to me where I went wrong in my reasoning? Thanks!