1. ## GCD Proof

Let a, b, c be integers. If gcd(a, b) = 1 and c | (a+b), show that gcd(a, c) = 1.

>>If gcd(a, b) = 1, then (a+b) shares no common factors with a or b, besides 1.
For this reason, c, a factor of (a+b), cannot share any common factors with a or b, besides 1.
Thus, gcd(a, c) = 1.

That is how I figured out this proof, but I'm worried I'll lose marks for not thoroughly explaining why the >>ed line is true. Try as I might, I haven't thought of a good way of proving it. I came to that conclusion intuitively, and I know it's right... Any hints toward proving it?
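Not the poster's original argument, but one standard hint in the direction asked for: rather than proving the >>ed line directly, take an arbitrary common divisor of a and c and show it must divide gcd(a, b). A sketch in LaTeX:

```latex
Let $d = \gcd(a, c)$. Then $d \mid c$ and $c \mid (a+b)$, so $d \mid (a+b)$.
Since $d \mid a$ as well, $d \mid \bigl((a+b) - a\bigr) = b$.
Thus $d$ is a common divisor of $a$ and $b$, so $d \mid \gcd(a, b) = 1$,
which forces $d = 1$, i.e.\ $\gcd(a, c) = 1$.
```

This sidesteps the >>ed claim entirely: nothing about the factors of (a+b) is needed, only that any divisor of c also divides (a+b).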

Also, my apologies if this is the wrong forum; I wasn't sure whether this question was tough enough for the other algebra forum.
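A quick brute-force check (not a proof) can at least confirm the claim holds for small positive integers; the function name `check` and the `limit` bound are my own choices for illustration:

```python
from math import gcd

def check(limit=50):
    """Search for a counterexample to the claim:
    if gcd(a, b) == 1 and c divides (a + b), then gcd(a, c) == 1.
    Returns the first (a, b, c) that violates it, or None if none is found."""
    for a in range(1, limit):
        for b in range(1, limit):
            if gcd(a, b) != 1:
                continue
            s = a + b
            for c in range(1, limit):
                if s % c == 0 and gcd(a, c) != 1:
                    return (a, b, c)  # counterexample found
    return None

print(check())  # None means the claim survived the search
```

Of course an exhaustive search over a finite range proves nothing in general, but it is a useful sanity check before committing to a proof.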

2. Originally Posted by BlackBlaze
Let a, b, c be integers. If gcd(a, b) = 1 and c | (a+b), show that gcd(a, c) = 1.

>>If gcd(a, b) = 1, then (a+b) shares no common factors with a or b, besides 1.
For this reason, c, a factor of (a+b), cannot share any common factors with a or b, besides 1.
Thus, gcd(a, c) = 1.

That is how I figured out this proof, but I'm worried I'll lose marks for not thoroughly explaining why the >>ed line is true. Try as I might, I haven't thought of a good way of proving it. I came to that conclusion intuitively, and I know it's right... Any hints toward proving it?

Also, my apologies if this is the wrong forum; I wasn't sure whether this question was tough enough for the other algebra forum.
This is in the wrong forum. See here

Ask questions there if you need clarification. I am going to close this thread since it is the same question.