Let a, b be positive integers and set d = gcd(a, b).
(a) Prove that d | gcd(a^2 + b^2, a + b).
(b) If a and b are odd, prove that 2d | gcd(a^2 + b^2, a + b).
I tried to solve it, but I'm not sure my argument is complete. Here is my attempt:
Since d divides both a and b, there are integers k and l s.t.
a = kd and b = ld.
Thus a^2 + b^2 = (k^2 + l^2)d^2
and a + b = (k + l)d.
Since d divides d^2, it follows that d divides both a^2 + b^2 and a + b, so d divides gcd(a^2 + b^2, a + b). This proves (a).
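This is not part of the proof, but a quick numeric sanity check of the divisibility in (a) may be reassuring (a small Python sketch; the bound 200 is an arbitrary choice):

    from math import gcd

    # Check part (a): d = gcd(a, b) should divide gcd(a^2 + b^2, a + b).
    for a in range(1, 200):
        for b in range(1, 200):
            d = gcd(a, b)
            assert gcd(a*a + b*b, a + b) % d == 0
    print("part (a) verified for all tested pairs")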
Now, for (b), assume that a and b are both odd.
Then d is odd (it divides the odd number a), and since a = kd and b = ld are odd, k and l must be odd as well.
Thus k^2 + l^2 is even (odd times odd is odd, and odd + odd is even),
and k + l is even.
Writing k^2 + l^2 = 2m and k + l = 2n, we get a^2 + b^2 = 2md^2 and a + b = 2nd, so 2d divides both. Thus 2d divides gcd(a^2 + b^2, a + b), which proves (b).
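Again only as a sanity check, the same kind of Python sketch can test (b) over odd pairs (the bound 200 is again arbitrary):

    from math import gcd

    # Check part (b): for odd a and b, 2*gcd(a, b) should divide gcd(a^2 + b^2, a + b).
    for a in range(1, 200, 2):
        for b in range(1, 200, 2):
            d = gcd(a, b)
            assert gcd(a*a + b*b, a + b) % (2 * d) == 0
    print("part (b) verified for all tested odd pairs")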