Hello! I'm having trouble understanding this statement:

> If one, in an attempt to factorize a composite number $n$, finds two squares $x^2$ and $y^2$ at random such that $x^2 \equiv y^2 \pmod{n}$, then taking the greatest common divisor of $x - y$ with $n$ yields a nontrivial factor of $n$ with probability $1/2$.

I don't understand why there is a $0.5$ probability of finding a nontrivial factor of $n$. If we call $p$ one factor of $n$, then whenever we find a nontrivial factor, $p$ must divide $x - y$, and so there are only limited possibilities for this to happen; whereas if we find a trivial factor of $n$, then $n \mid x - y$, and there are infinitely many numbers $x$ and $y$ that satisfy this condition. So it seems it should be very likely that we find a trivial factor of $n$ with a difference of squares, which is not true.
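For what it's worth, I tried to check the $1/2$ empirically. Here is a small sketch (my own assumptions: $n$ is a product of two distinct odd primes, $x$ is drawn coprime to $n$, and $y$ is drawn uniformly among the square roots of $x^2 \bmod n$, found by brute force since $n$ is tiny):

```python
import math
import random

def trial(n, rng):
    # Pick a random x coprime to n (the setting the statement assumes).
    x = rng.randrange(1, n)
    while math.gcd(x, n) != 1:
        x = rng.randrange(1, n)
    # Brute-force all square roots y of x^2 mod n and pick one at random.
    roots = [y for y in range(1, n) if (y * y - x * x) % n == 0]
    y = rng.choice(roots)
    # gcd(x - y, n) is nontrivial iff it is neither 1 nor n.
    g = math.gcd(x - y, n)
    return 1 < g < n

rng = random.Random(0)
n = 3 * 7  # 21, a product of two distinct odd primes
hits = sum(trial(n, rng) for _ in range(2000))
print(hits / 2000)  # comes out close to 0.5
```

In this simulation the fraction of trials yielding a nontrivial factor does hover around $1/2$: for $x$ coprime to $n = pq$, the congruence $y^2 \equiv x^2 \pmod{n}$ has exactly four roots, and only two of them ($y \equiv \pm x$) give a trivial gcd. So the claim holds numerically, which is exactly why I suspect my counting argument above is wrong.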

Can somebody explain to me where I went wrong in my reasoning? Thanks!