The question: Prove that if x and y are distinct real numbers, then (x+1)^2 = (y+1)^2 iff x + y = -2. How does the conclusion change if we allow x = y?
(The professor's instructions require that we assume a^2 = b^2 implies a = b or a = -b.)
This is what I have down so far:
pf | Let x, y be distinct real numbers. Then (x+1)^2 = (y+1)^2 iff x + y = -2.
Assume a^2 = b^2 implies a = b or a = -b.
Let P be the statement that (x+1)^2 = (y+1)^2, and let Q be the statement that x + y = -2.
We will show that 1. P implies Q, and 2. Q implies P
1. Showing P implies Q
Applying the assumption with a = x + 1 and b = y + 1 gives x + 1 = y + 1 or x + 1 = -(y + 1). The first case gives x = y, but since x and y are distinct numbers, we can rule it out. The second case gives:
x+1 = - (y+1)
x+1 = -y-1
x + y = -2
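Not a proof, but as a quick numerical sanity check of both directions of the claim (a sketch using Python's exact fractions to avoid rounding issues):

```python
from fractions import Fraction

def squares_equal(x, y):
    # P: (x+1)^2 == (y+1)^2
    return (x + 1) ** 2 == (y + 1) ** 2

# Q implies P: for several x (avoiding x = -1, where x = y),
# set y = -2 - x so that Q holds, and check that P holds too.
for x in [Fraction(-5), Fraction(-1, 2), Fraction(3), Fraction(7, 4)]:
    y = -2 - x
    assert x != y
    assert squares_equal(x, y)

# P implies Q: scan a small grid of rationals; whenever P holds
# with x != y, the sum x + y should be -2.
vals = [Fraction(n, 2) for n in range(-12, 13)]
for x in vals:
    for y in vals:
        if x != y and squares_equal(x, y):
            assert x + y == -2

print("both directions check out on these samples")
```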
(Does what I just did there show that P implies Q?)
2. Q implies P
To show this, do I just solve x + y = -2 for y, plug it back into P, and confirm that P is in fact true?
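For example, I think the substitution would look something like this (solving Q for y and plugging into P):

```latex
\begin{align*}
x + y = -2 &\implies y = -2 - x \\
(y+1)^2 &= (-2 - x + 1)^2 = \bigl(-(x+1)\bigr)^2 = (x+1)^2
\end{align*}
```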
And then, since I have shown that both P implies Q and Q implies P, does that mean my proof is finished?
Also, if we allow x = y, does the conclusion change at all? It seems like it wouldn't, because if I use the instructor's assumption (a^2 = b^2 implies a = b or a = -b), the statement still holds: with x = y we get x = x or x = -x, and while x = -x isn't true in general, x = x is, and we only need one of the two to be true.
Please tell me I'm on track... I'm SO lost in this course...