f(n) is defined by setting f(1) = 2 and
f(n) = 0.5(f(n-1) + 2/f(n-1))
I need to prove that f(n)^2 is always strictly greater than 2. I tried to do this by finding an expression for f(n)^2 in terms of f(n-1) and then differentiating with respect to f(n-1), which gave a minimum value of 2 for f(n)^2 (sketched below). However, that only shows f(n)^2 >= 2, i.e. that f(n)^2 can equal 2 or be larger, not the strict inequality f(n)^2 > 2 that was required.
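For concreteness, writing x for f(n-1), the expression I mean works out to something like this (a sketch of the algebra):

$$
f(n)^2 = \tfrac{1}{4}\left(x + \tfrac{2}{x}\right)^{2} = \frac{x^{2}}{4} + 1 + \frac{1}{x^{2}},
$$

which, for x > 0, has its minimum value 2 at x = sqrt(2).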
Can anybody tell me where I'm going wrong, thanks!
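For what it's worth, here is a quick numerical check (a small Python sketch, floating-point only, so it illustrates the behaviour but proves nothing) suggesting f(n)^2 does stay above 2 and creeps down towards it:

```python
# Numerical illustration of the recursion f(1) = 2, f(n) = 0.5*(f(n-1) + 2/f(n-1)).
# Floating-point rounding eventually hides the gap, so this is not a proof.

def f(n: int) -> float:
    x = 2.0  # f(1)
    for _ in range(n - 1):
        x = 0.5 * (x + 2.0 / x)
    return x

for n in range(1, 6):
    # f(n)^2 - 2 stays positive but shrinks rapidly towards 0
    print(n, f(n), f(n) ** 2 - 2)
```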