# GCD=1 Problem

• February 5th 2007, 07:51 AM
GCD=1 Problem
Let r,s,t be integers. If r=st+1, prove gcd(r,s) = 1.

My work so far:

I know that to prove gcd(r,s) =1, I need to have ra + sb = 1 for some integers a and b.

Now r = st + 1 means 1 = r - st.

Well, r = r(1), and -st can be written as s(-t).

So 1 = r(1) + s(-t)

Is that right?

Thank you.

KK
• February 5th 2007, 08:02 AM
ThePerfectHacker
Quote:

Originally Posted by tttcomrader
Let r,s,t be integers. If r=st+1, prove gcd(r,s) = 1.

It is much simpler than that:
$\gcd (a,b) =1$ for $a,b>0$
if and only if there exist integers $x,y$ such that
$ax+by=1$.

Thus, given,
$r=st+1$
$r-st=1$
$r(1)+s(-t)=1$
Since the integers $x=1, y=-t$ satisfy
$rx+sy=1$,
we conclude, assuming $r,s>0$, that $\gcd(r,s)=1$.
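The Bezout identity above can be sanity-checked numerically. A quick sketch in Python (not part of the original thread): for r = st + 1, the pair (x, y) = (1, -t) always witnesses rx + sy = 1, and `math.gcd` confirms gcd(r, s) = 1.

```python
import math

# Numerical sanity check (not a proof): for r = s*t + 1,
# gcd(r, s) should be 1, and (x, y) = (1, -t) is a Bezout pair.
for s in range(1, 20):
    for t in range(-10, 10):
        r = s * t + 1
        assert math.gcd(r, s) == 1          # gcd claim
        assert r * 1 + s * (-t) == 1        # Bezout witness
print("all checks passed")
```

Note that Python's `math.gcd` accepts negative arguments and returns the nonnegative gcd, so the check also covers negative values of t, where r itself may be negative.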