Math Help - Algorithm

  1. #1

    Algorithm

    Let a, b be positive integers and set d = gcd(a, b).
    (a) Prove d | gcd(a^2 + b^2, a + b).
    (b) If a and b are odd, prove that 2d | gcd(a^2 + b^2, a + b).
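
    Just to illustrate the claim with one concrete instance (my own example, not part of the problem): take a = 3 and b = 9, so d = 3. Then a^2 + b^2 = 90 and a + b = 12, and gcd(90, 12) = 6, which is divisible by d = 3 and, since both a and b are odd, also by 2d = 6.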

    I tried to solve it, but I couldn't.

  2. #2
    Since d divides both a and b, there are integers k and l such that
    a = kd and b = ld.
    Thus a^2 + b^2 = (k^2 + l^2)d^2
    and a + b = (k + l)d,
    so d divides both a^2 + b^2 and a + b. Since any common divisor of two
    numbers divides their gcd, d | gcd(a^2 + b^2, a + b), which proves (a).
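
    Not part of the argument, but here is a quick brute-force check of (a) in
    Python; the bound 50 is an arbitrary choice, and the names a, b, d, g just
    mirror the notation above.

        from math import gcd

        # Check (a): d = gcd(a, b) divides gcd(a^2 + b^2, a + b).
        for a in range(1, 51):
            for b in range(1, 51):
                d = gcd(a, b)
                g = gcd(a * a + b * b, a + b)
                assert g % d == 0, (a, b)

        print("(a) verified for all 1 <= a, b <= 50")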

    Now, for (b), assume that a and b are both odd.
    Since a = kd is odd, both k and d must be odd; likewise l is odd.
    Thus k^2 + l^2 is even (odd times odd is odd, and odd + odd is even),
    and k + l is even.
    Hence 2d divides a + b = (k + l)d, and 2d divides a^2 + b^2 = (k^2 + l^2)d^2,
    so 2d divides gcd(a^2 + b^2, a + b) in this case.
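
    The same kind of sanity check, restricted to odd a and b (again, the bound
    is arbitrary and this is only a spot check, not a proof):

        from math import gcd

        # Check (b): for odd a, b, 2*gcd(a, b) divides gcd(a^2 + b^2, a + b).
        for a in range(1, 100, 2):      # odd values of a
            for b in range(1, 100, 2):  # odd values of b
                d = gcd(a, b)
                g = gcd(a * a + b * b, a + b)
                assert g % (2 * d) == 0, (a, b)

        print("(b) verified for all odd 1 <= a, b <= 99")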

    Best,

    ZD