
Math Help - GCD

  1. #1
    Newbie
    Joined
    Sep 2008
    Posts
    4

    GCD

    Hi, I am a little confused by this problem. Please help!!

    Assuming gcd(a,b)=1, prove that gcd(a+b, a^2+b^2)=1 or 2

    So far I know that a^2 + b^2 = (a+b)(a-b) + 2b^2
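
    As a quick sanity check of the statement, a brute-force scan over small coprime pairs (Python, standard-library math.gcd only; just an illustration, not a proof) confirms both the identity above and the claimed result.

    Code:
    from math import gcd

    # Check gcd(a+b, a^2+b^2) in {1, 2} for small coprime pairs,
    # and the identity a^2 + b^2 = (a+b)(a-b) + 2b^2 along the way.
    for a in range(1, 200):
        for b in range(1, 200):
            if gcd(a, b) == 1:
                assert a*a + b*b == (a + b)*(a - b) + 2*b*b
                assert gcd(a + b, a*a + b*b) in (1, 2), (a, b)
    print("checked all coprime pairs with 1 <= a, b < 200")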

  2. #2
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Let d be a (positive) common divisor of a^2+b^2 and of a+b. You wrote a^2+b^2 = (a+b)(a-b) + 2b^2, so d divides both the left-hand side and the first term on the right-hand side, which implies that d | 2b^2 as well.
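
    To make this step concrete, here is a tiny numeric illustration (sample values a = 3, b = 7, chosen only for concreteness): every common divisor of a+b and a^2+b^2 does divide 2b^2.

    Code:
    # Sample coprime pair, chosen only for illustration.
    a, b = 3, 7
    s, q = a + b, a*a + b*b               # s = 10, q = 58
    print(q == (a + b)*(a - b) + 2*b*b)   # the identity: True
    # Every common divisor d of s and q divides 2*b^2 (= 98 here).
    for d in range(1, s + 1):
        if s % d == 0 and q % d == 0:
            print(d, (2*b*b) % d == 0)    # prints: 1 True, 2 True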

    Suppose 2 does not divide d. Then 2 and d are relatively prime (using the fact that 2 is prime), so that d | b^2. Remembering that d | a^2+b^2, we deduce d | a^2. So d | b^2 and d | a^2. However, a and b are relatively prime, so a^2 and b^2 are relatively prime as well, and their only positive common divisor is 1. So d = 1.

    Suppose now that 2 divides d. Then, writing d = 2d', we have d' | b^2 (since 2d' | 2b^2), and d' | d with d | a^2+b^2, so d' | a^2 and d' | b^2. Again, because gcd(a^2,b^2) = 1, we get d' = 1, so that d = 2.

    We have shown that every positive common divisor d of a^2+b^2 and a+b is either 1 or 2. This implies that gcd(a^2+b^2, a+b) is either 1 or 2.
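
    As an easy follow-up, the parity of a and b decides which of the two values occurs: since gcd(a,b) = 1, a and b are never both even; if both are odd, then a+b and a^2+b^2 are both even and the gcd is 2, while if exactly one is odd, a+b is odd and the gcd is 1. A short check (Python, assuming math.gcd; an illustration rather than a proof):

    Code:
    from math import gcd

    # Which value occurs? Split the coprime pairs by parity.
    for a in range(1, 100):
        for b in range(1, 100):
            if gcd(a, b) != 1:
                continue
            g = gcd(a + b, a*a + b*b)
            if a % 2 == 1 and b % 2 == 1:
                assert g == 2      # both odd  -> a+b and a^2+b^2 are even
            else:
                assert g == 1      # mixed parity -> a+b is odd
    print("parity criterion checked for 1 <= a, b < 100")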

    This was a nice problem, thank you!
    Laurent.
