If x^2 ≠ 2 then
(0.5(x + 2/x))^2 - 2 = 0.25(x^2 + 4 + 4/x^2) - 0.25*8 = 0.25(x - 2/x)^2 > 0
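As a quick numeric sanity check of that identity (a Python sketch; the function name `step` is my own, not from the thread):

```python
import math

def step(x):
    # one iteration: x -> 0.5*(x + 2/x)
    return 0.5 * (x + 2.0 / x)

# (step(x))^2 - 2 should equal 0.25*(x - 2/x)^2, which is positive
# whenever x^2 != 2
for x in [0.5, 1.0, 1.5, 3.0, 10.0]:
    lhs = step(x) ** 2 - 2.0
    rhs = 0.25 * (x - 2.0 / x) ** 2
    assert math.isclose(lhs, rhs)
    assert lhs > 0.0
```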
f(n) is defined by setting f(1) = 2 and
f(n) = 0.5(f(n-1) + 2/f(n-1))
I need to prove that f(n)^2 will always be bigger than 2. I tried doing this by finding an expression for f(n)^2 and then differentiating with respect to f(n-1), which gave a minimum value of 2 for f(n)^2. However, this only implies that f(n)^2 can equal 2 or more, not strictly more than 2 as required.
Can anybody tell me where I'm going wrong, thanks!
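For what it's worth, iterating the recurrence numerically does suggest the gap f(n)^2 - 2 shrinks fast but stays positive, at least until floating-point precision runs out (a Python sketch; the name `f_seq` is mine):

```python
def f_seq(n):
    # f(1) = 2, f(n) = 0.5*(f(n-1) + 2/f(n-1))
    x = 2.0
    for _ in range(n - 1):
        x = 0.5 * (x + 2.0 / x)
    return x

for n in range(1, 8):
    x = f_seq(n)
    print(n, x, x * x - 2.0)  # the gap x^2 - 2 shrinks towards 0
```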
My proof is OK because it starts off with the assumption that x is not a square root of 2.
So I have proved that if x^2 ≠ 2 it stays that way, and that all iterates apart from possibly the starting value are greater than the square root of 2, so I did prove what you asked in the first place. But you are right that I haven't proved that the value converges to the square root of 2 yet.

To do that I'd need to establish an upper bound for f(n) such that f(n)^2 - 2 tended to 0. I'm sure this can't be difficult. The easiest way would probably be to show that (f(n+1)^2 - 2)/(f(n)^2 - 2) was less than some k with k < 1. Then f(n)^2 - 2 would shrink at least geometrically, so f(n) would have to tend to a square root of 2.
My epsilon delta proof skills are rusty, but here are some pointers that should let you prove this properly.
Writing x = f(n): f(n+1)/f(n) = 0.5(x + 2/x)/x = 0.5 + 1/x^2.
So for f(1) = 2 the sequence is decreasing and bounded below by the square root of 2, so it must converge to something. And f(n+1)/f(n) is strictly less than 1 except at x = square root of 2, which means the limit cannot be anything greater than the square root of 2.
In fact if f(n)^2 = 2 + e then

f(n+1)^2 = 0.25(2 + e) + 1 + 1/(2 + e)
= 1.5 + 0.25e + 0.5(1 + e/2)^-1

which for small e is

= 1.5 + 0.25e + 0.5(1 - e/2 + e^2/4 - e^3/8 + ...)

which is approximately 2 + e^2/8, which shows the convergence to the square root of 2 is quadratic.
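The e^2/8 prediction checks out numerically (a Python sketch; I start from f(2) = 1.5 so that e is already smallish): each new error f(n+1)^2 - 2 tracks (f(n)^2 - 2)^2 / 8 ever more closely.

```python
x = 1.5  # f(2), where e = 0.25
for _ in range(3):
    e = x * x - 2.0
    x = 0.5 * (x + 2.0 / x)
    e_next = x * x - 2.0
    # e_next should be close to e^2/8, per the series expansion above
    print(e_next, e * e / 8.0)
```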