[SOLVED] inductive proof

Oct 2009
For n in the natural numbers, prove that (2+i)^n is never real.

Base case:
(2+i)^1 = 2+i is not real.

Inductive hypothesis:
Assume (2+i)^k is not real for some natural number k.

Must prove:
(2+i)^(k+1) is not real.

I've got to the point where I need to show that
2(2+i)^k + i(2+i)^k is not real. It seems plausible that the imaginary terms would not cancel out, making the sum not real, but I can't figure out a way to prove it.
I've tried using the binomial theorem. Any HINTS would be greatly appreciated.
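A quick numeric check (just a sanity check for small n, not a proof — the bound of 20 is an arbitrary choice) suggests the imaginary part never vanishes:

```python
# Sanity check, not a proof: compute (2+i)^n for small n and confirm
# the imaginary part is nonzero. For n <= 20 the real and imaginary
# parts are integers well below 2**53, so float arithmetic is exact.
for n in range(1, 21):
    z = (2 + 1j) ** n
    assert z.imag != 0, f"(2+i)^{n} came out real"
    print(n, z)
```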
 

Defunkt

MHF Hall of Honor
If \(\displaystyle (2+i)^k\) is not real then you can write it as \(\displaystyle a + i \cdot b\) with \(\displaystyle a, b \in \mathbb{R}\) and \(\displaystyle b \neq 0\). Now, what is \(\displaystyle (2+i)(2+i)^k\)?
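If you want to double-check the algebra behind this hint mechanically, a symbolic computation does it (sympy is my choice here, not something from the thread):

```python
# Multiply a generic a + i*b by (2 + i) and read off the parts.
from sympy import I, expand, im, re, symbols

a, b = symbols("a b", real=True)
product = expand((a + I*b) * (2 + I))
print(re(product))  # 2*a - b   (the real part)
print(im(product))  # a + 2*b   (the imaginary part)
```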
 

Defunkt

MHF Hall of Honor
Correct. Now use my hint to prove the inductive step.
 
Oct 2009
Interesting. So, following your reasoning,
\(\displaystyle (2+i)(2+i)^k = (a+ib)(2+i),\)
and the expansion is \(\displaystyle (2a-b) + i(a+2b)\).
It's clear that this cannot be zero unless a and b are both zero, because for the whole sum to be zero we would need both b = 2a and b = -a/2. But how do we rule out b = -a/2 on its own, which would make just the imaginary term zero? We know nothing about a and b besides that they are real and, from the inductive hypothesis, that b ≠ 0.
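One standard way to finish (a sketch of a common route, not something settled in the thread): note that a and b are in fact integers, and track the pair (a, b) modulo 5. Starting from (2, 1), the update (a, b) → (2a − b, a + 2b) just cycles between (2, 1) and (3, 4) mod 5, so the imaginary part is never divisible by 5, and in particular never zero. A small script illustrates the cycle:

```python
# Follow (2+i)^n = a + b*i via (a, b) -> (2a - b, a + 2b) and record
# the pair mod 5; it alternates between (2, 1) and (3, 4), so b is
# never divisible by 5, hence never zero.
a, b = 2, 1  # (2+i)^1
pairs = []
for n in range(1, 13):
    pairs.append((a % 5, b % 5))
    a, b = 2*a - b, a + 2*b
print(pairs)  # [(2, 1), (3, 4), (2, 1), (3, 4), ...]
```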