Hello! I'm having trouble understanding this statement:

"If one, in an attempt to factorize a composite number $\displaystyle n$, finds two numbers $\displaystyle a$ and $\displaystyle b$ at random such that $\displaystyle a^2 - b^2 \equiv 0 \pmod{n}$, then taking the greatest common divisor of $\displaystyle a \pm b$ with $\displaystyle n$ yields a nontrivial factor of $\displaystyle n$ with probability $\displaystyle \frac{1}{2}$."

I don't understand why there is a probability of $\displaystyle \frac{1}{2}$ of finding a nontrivial factor of $\displaystyle n$. If we call $\displaystyle p$ one factor of $\displaystyle n$, then finding a nontrivial factor requires that $\displaystyle p$ divide $\displaystyle a \pm b$, and there are only $\displaystyle \frac{n}{p}$ possibilities for this to happen; whereas if we find a trivial factor of $\displaystyle n$, then $\displaystyle n \ | \ a \pm b$, and there are infinitely many numbers $\displaystyle a$ and $\displaystyle b$ satisfying that condition. So it would seem very likely that the difference of squares yields a trivial factor of $\displaystyle n$, which is not true.
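To make the claimed $\displaystyle \frac{1}{2}$ concrete, here is a small Python sketch (the helper name `nontrivial_fraction` and the choice $\displaystyle n = 91 = 7 \cdot 13$ are mine, not part of the statement) that enumerates every pair $\displaystyle (a, b)$ of units mod $\displaystyle n$ with $\displaystyle a^2 \equiv b^2 \pmod{n}$ and counts how often $\displaystyle \gcd(a - b, n)$ is nontrivial:

```python
from math import gcd
from itertools import product

def nontrivial_fraction(n):
    """Fraction of congruent-square pairs (a, b) mod n for which
    gcd(a - b, n) is a nontrivial factor of n."""
    # Restrict to a, b coprime to n, as in the usual analysis.
    units = [x for x in range(1, n) if gcd(x, n) == 1]
    good = total = 0
    for a, b in product(units, units):
        if (a * a - b * b) % n == 0:
            total += 1
            d = gcd(a - b, n)
            if 1 < d < n:  # nontrivial factor found
                good += 1
    return good / total

# n = 91 = 7 * 13: each a^2 has four square roots mod n; the two
# roots b ≡ ±a give a trivial gcd, the other two give 7 or 13.
print(nontrivial_fraction(91))  # → 0.5
```

The point the enumeration illustrates: the probability is taken over pairs that already satisfy $\displaystyle a^2 \equiv b^2 \pmod{n}$, not over all integers, so the count of "trivial" pairs is not infinite relative to the sample space.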

Can somebody explain to me where I went wrong in my reasoning? Thanks!