How do I prove that f(n+1)^2 = f(n)f(n+2) + (-1)^n, where f(n), f(n+1), f(n+2) are consecutive Fibonacci sequence terms? I have no idea where to start.
You have to use a bit of ingenuity in the induction in order to get the (-1)^k term to change sign. One way is to start by writing
f(k+2)^2 = f(k+2)f(k) + [f(k+2) - f(k)]f(k+2).
Then for the first term on the right-hand side, use the inductive hypothesis in the form f(k+2)f(k) = f(k+1)^2 - (-1)^k. For the second term, use the fact that f(k+2) - f(k) = f(k+1) to write it as f(k+1)f(k+2). You should then be able to complete the inductive step.
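As a quick sanity check (not a substitute for the proof), a short Python sketch can verify the identity for the first several terms. This assumes the indexing f(1) = f(2) = 1; adjust if your course uses f(0) = 0:

```python
# Sanity check of the identity f(n+1)^2 = f(n)*f(n+2) + (-1)^n
# using the indexing convention f(1) = f(2) = 1 (an assumption).

def fib(m):
    # Iterative Fibonacci: returns the m-th term with f(1) = f(2) = 1.
    a, b = 1, 1
    for _ in range(m - 1):
        a, b = b, a + b
    return a

for n in range(1, 20):
    lhs = fib(n + 1) ** 2
    rhs = fib(n) * fib(n + 2) + (-1) ** n
    assert lhs == rhs, (n, lhs, rhs)
print("identity holds for n = 1..19")
```

Of course this only checks finitely many cases; the induction above is what makes it a proof.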
Thank you Opalg, thank you danny, thank you Archie meade for getting involved. I am going to verify your suggestions, but I have not done it yet. What if I rewrite the equality as:
(f(n+1))^2 - f(n)f(n+2) = (-1)^n and then substitute the terms f(n+1) = f(n) + f(n-1) and f(n+2) = f(n) + f(n+1) into the equality so that the terms reduce. It seems that if I use the method of infinite descent, combined with the well-ordering principle, the whole left-hand side can be reduced to (-1)^n. Can someone tell me whether this would be valid? I will review the suggestions above, but I would like a backup in case induction does not work.
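To make the reduction I have in mind concrete, write D(n) for the left-hand side; this is only a sketch of one step of the descent:

```latex
\begin{align*}
D(n) &:= f(n+1)^2 - f(n)f(n+2) \\
     &= \bigl(f(n) + f(n-1)\bigr)^2 - f(n)\bigl(2f(n) + f(n-1)\bigr) \\
     &= f(n-1)^2 + f(n)f(n-1) - f(n)^2 \\
     &= f(n-1)\bigl(f(n-1) + f(n)\bigr) - f(n)^2 \\
     &= f(n-1)f(n+1) - f(n)^2 \\
     &= -D(n-1).
\end{align*}
```

If this is right, then D(n) = -D(n-1) descends all the way to D(1) = f(2)^2 - f(1)f(3) = -1, giving D(n) = (-1)^n, though I suspect this is really just the same induction written in a different order.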