# Math Help - Proof by Contradiction: (n+2)^3 = n^3 + (n+1)^3 ?

1. ## Proof by Contradiction: (n+2)^3 = n^3 + (n+1)^3 ?

I have been working through D. L. Johnson's "Elements of Logic via Numbers and Sets" and have come across this problem, amongst others, which has me stumped.

"Prove by contradiction that the cube of the largest of three consecutive integers cannot be equal to the sum of the cubes of the other two."

Assume, then (for contradiction) that the above is not the case, i.e.

$(n+2)^3 = n^3 + (n+1)^3$

for some integer $n$. Then

$\begin{array}{rcl}
n^3 + 3n^2 \cdot (2) + 3n \cdot (2)^2 + 8 & = & n^3 + n^3 + 3n^2 + 3n + 1 \\
n^3 + 6n^2 + 12n + 8 & = & 2n^3 + 3n^2 + 3n + 1 \\
n^3 - 3n^2 - 9n - 7 & = & 0
\end{array}$.

I suppose that we wish to show that this is not the case for any integer $n$, to provide the necessary contradiction. I'm not sure, though, how I might do this. Any hints much appreciated.
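Before hunting for the proof itself, a quick brute-force check (evidence only, not a proof — the contradiction argument is still needed to cover all integers) can confirm that no integer in a wide window satisfies the equation:

```python
# Sanity check: does (n+2)^3 ever equal n^3 + (n+1)^3 for an integer n?
# Exact integer arithmetic, so there are no rounding issues.

def cubes_match(n: int) -> bool:
    """True if the cube of the largest of n, n+1, n+2 equals
    the sum of the cubes of the other two."""
    return (n + 2) ** 3 == n ** 3 + (n + 1) ** 3

# No integer in a generous window satisfies the equation.
counterexamples = [n for n in range(-10_000, 10_000) if cubes_match(n)]
print(counterexamples)  # -> []
```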

2. Originally Posted by Harry1W
I suppose that we wish to show that this is not the case for any integer $n$, to provide the necessary contradiction. I'm not sure, though, how I might do this. Any hints much appreciated.
Don't you just solve for the polynomial's roots to show that $r_{1},r_{2},r_{3}\not\in\mathbb{Z}$? Are you really asking us how you solve for the roots? I only know how to solve quadratics w/o a calculator.

3. Originally Posted by Harry1W
$n^3 - 3n^2 - 9n - 7 = 0$

I suppose that we wish to show that this is not the case for any integer $n$, to provide the necessary contradiction. I'm not sure, though, how I might do this. Any hints much appreciated.
Look at where the turning points of the cubic are: the derivative is $3n^2 - 6n - 9 = 3(n - 3)(n + 1)$, so they occur at $n = -1$ and $n = 3$, and the cubic is negative at both. So the cubic has only one real root, and it lies to the right of $n = 3$. Note that the cubic is negative for n = 5 and positive for n = 6. What do you conclude?
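The hint can be checked numerically. The sketch below (plain Python; the function name `f` is mine) writes the cubic as the difference of the two sides of the assumed equation, confirms it is negative at both turning points and changes sign between $n = 5$ and $n = 6$, then bisects to locate the single real root:

```python
# The cubic, written as the difference of the two sides of the assumed
# equation; it expands to n^3 - 3n^2 - 9n - 7.
def f(n: float) -> float:
    return n**3 + (n + 1) ** 3 - (n + 2) ** 3

# Turning points are at n = -1 and n = 3 (roots of f'(n) = 3n^2 - 6n - 9);
# f is negative at both, so there is exactly one real root.
assert f(-1) < 0 and f(3) < 0
assert f(5) < 0 < f(6)  # sign change: the root lies in (5, 6)

lo, hi = 5.0, 6.0
for _ in range(60):  # each bisection step halves the bracketing interval
    mid = (lo + hi) / 2
    if f(mid) < 0:
        lo = mid
    else:
        hi = mid
print(lo)  # roughly 5.05 -- strictly between 5 and 6, so not an integer
```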

4. Originally Posted by mr fantastic
Look at where the turning points of the cubic are. So the cubic has only one root. It lies between n = 3 and n = 6. Note that the cubic is negative for n = 5 and positive for n = 6. What do you conclude?
Got it. Both the turning points are below the n-axis, so there can be only one real root. How, however, did you conclude that it must lie between n = 3 and n = 6? I can understand the reasoning in the choice of n = 3, since we know the cubic to lie beneath the n-axis there, but why n = 6? Was this just a random stab in the dark? What would have happened if the root lay at, say, n = 4000/3 (i.e. a much larger rational, non-integer value)? Would one use the Newton-Raphson method, or a similar iterative process, until the 'window' was sufficiently small to evaluate f(n) at a few values?
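On the last question: yes, Newton-Raphson would do the job when no convenient integer bracket is obvious. A minimal sketch on this cubic (function names are mine; starting to the right of the last turning point at $n = 3$ guarantees convergence to the single real root):

```python
# Newton-Raphson: iterate n <- n - f(n)/f'(n) until the step is tiny.

def f(n: float) -> float:
    return n**3 + (n + 1) ** 3 - (n + 2) ** 3  # = n^3 - 3n^2 - 9n - 7

def f_prime(n: float) -> float:
    return 3 * n**2 - 6 * n - 9

def newton(n: float, tol: float = 1e-12, max_iter: int = 100) -> float:
    for _ in range(max_iter):
        step = f(n) / f_prime(n)
        n -= step
        if abs(step) < tol:
            break
    return n

root = newton(6.0)  # start to the right of the turning point at n = 3
print(root)  # roughly 5.05
print(abs(root - round(root)) > 1e-6)  # True: not within 1e-6 of any integer
```

Once the root is pinned between two consecutive integers by a sign check, no further precision is needed for the contradiction.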

5. Originally Posted by Harry1W
$n^3 - 3n^2 - 9n - 7 = 0$.

I suppose that we wish to show that this is not the case for any integer $n$, to provide the necessary contradiction. I'm not sure, though, how I might do this. Any hints much appreciated.
Here's another approach which may be fruitful (however, I can't promise; I haven't actually tried it).

One approach might be to note that this cubic is monic with integer coefficients, so any rational root is an integer, and the product of all 3 roots has to be 7. Hence any integer root must divide 7.

So you can test by exhaustion whether $n - 1$, $n + 1$, $n - 7$ and $n + 7$ are factors of this cubic. You'll find they're not. Therefore, none of $\pm 1$ and $\pm 7$ are roots, and the cubic has no integer roots at all.

Because the sum of the roots is 3, and the sum of products of pairs of roots is $-9$ (both of which are integers), you may be able to derive further limitations on the roots, if needed.
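The exhaustion step can be mechanized with the rational root theorem: the cubic is monic with integer coefficients, so any rational root is an integer dividing the constant term. A sketch (the name `g` is mine; expanding the difference of the two sides directly gives $-7$ as the constant term, which `g(0)` recovers):

```python
# Rational root test: integer roots of a monic integer polynomial
# must divide the constant term.

def g(n: int) -> int:
    # The cubic as the difference of the two sides of the assumed equation.
    return n**3 + (n + 1) ** 3 - (n + 2) ** 3

constant = g(0)  # the cubic's constant term
candidates = [d * s for d in range(1, abs(constant) + 1)
              if abs(constant) % d == 0 for s in (1, -1)]
roots = [n for n in candidates if g(n) == 0]
print(candidates)  # -> [1, -1, 7, -7]
print(roots)       # -> [] : no integer roots, which is the contradiction
```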

6. Originally Posted by Harry1W
$n^3 - 3n^2 - 9n - 7 = 0$

I suppose that we wish to show that this is not the case for any integer $n$, to provide the necessary contradiction. I'm not sure, though, how I might do this. Any hints much appreciated.
In
$n^3 - 3n^2 - 9n - 7 = 0$,
notice that $3n^2$ and $9n$ are divisible by 3, and that $n^3 \equiv n \pmod{3}$ for every integer $n$. Reducing modulo 3 therefore gives $n - 7 \equiv 0 \pmod{3}$, i.e. $n \equiv 1 \pmod{3}$.

Let $n = 3x + 1$, where $x$ is an integer. Then
$27 x^3 - 36 x - 18 = 0$, whence (dividing by 9)
$3x^3 - 4x - 2 = 0$, so
$3x^3 = 4x + 2$.
The right-hand side is even, so $x^3$, and hence $x$, must be even. Let $x = 2y$: then
$24 y^3 - 8y - 2 = 0$, i.e.
$12 y^3 - 4y = 1$,
where the left-hand side is even and the right-hand side is odd.

That can't work out for any integer, which is the contradiction we need.
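The modular route can be checked numerically. The sketch below (the name `g` is mine) writes the cubic as the difference of the two sides — expanding gives constant term $-7$ — verifies that it vanishes mod 3 only when $n \equiv 1 \pmod{3}$, and confirms that substituting $n = 3x + 1$ collapses it to $9(3x^3 - 4x - 2)$, where $3x^3 - 4x - 2 = 0$ is impossible by parity:

```python
# Check the modular steps numerically.

def g(n: int) -> int:
    # Difference of the two sides; expands to n^3 - 3n^2 - 9n - 7.
    return n**3 + (n + 1) ** 3 - (n + 2) ** 3

# g(n) mod 3 depends only on n mod 3, and is 0 mod 3 only when n = 1 (mod 3).
print(sorted({(n % 3, g(n) % 3) for n in range(-300, 300)}))
# -> [(0, 2), (1, 0), (2, 1)]

# Substituting n = 3x + 1 collapses the cubic to 9 * (3x^3 - 4x - 2).
assert all(g(3 * x + 1) == 9 * (3 * x**3 - 4 * x - 2) for x in range(-100, 100))

# 3x^3 - 4x - 2 = 0 forces x even (the right side of 3x^3 = 4x + 2 is even),
# but x = 2y then gives 12y^3 - 4y = 1: even = odd. So it never vanishes.
print(any(3 * x**3 - 4 * x - 2 == 0 for x in range(-10_000, 10_000)))  # -> False
```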