Originally Posted by **Dagger2006**

The function *f* defined by

f(x) = (x^3 + 1)/3

has three fixed points, say alpha, beta, and gamma, where

-2 < alpha < -1, 0 < beta < 1, 1 < gamma < 2.

For an arbitrarily chosen x_1, define the sequence {x_n} by setting x_{n+1} = f(x_n).

If x_1 < alpha, prove that x_n --> -infinity as n --> infinity.

Right now my thought is to first show that {x_n} is a monotonically decreasing sequence. Then I want to show that |x_n - x_{n+1}| >= d for every n, for some fixed d > 0. From there it should be easy to show that the sequence goes to negative infinity. My problem is showing that |x_n - x_{n+1}| >= d; I can't seem to figure out the correct way to state this mathematically. It's possible that this route won't work at all, and that's why I can't prove it. Thanks for any assistance.
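Not a proof, but a quick numerical sketch of the strategy above may help build intuition (the bisection bracket [-2, -1] comes from the stated bound on alpha; the starting offset 0.01 is an arbitrary choice). Once an iterate is below alpha, each step lands even further below, and the gaps |x_n - x_{n+1}| actually grow rather than merely staying above some d:

```python
# Iterate f(x) = (x**3 + 1)/3 starting just below the smallest
# fixed point alpha, and watch the iterates run off toward -infinity.

def f(x):
    return (x**3 + 1) / 3

# Fixed points solve f(x) = x, i.e. g(x) = x**3 - 3*x + 1 = 0.
def g(x):
    return x**3 - 3*x + 1

# Locate alpha by bisection on [-2, -1] (g(-2) < 0 < g(-1)).
lo, hi = -2.0, -1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if g(lo) * g(mid) <= 0:
        hi = mid
    else:
        lo = mid
alpha = (lo + hi) / 2  # roughly -1.879

x = alpha - 0.01  # start strictly below alpha
for n in range(1, 9):
    x_next = f(x)
    print(n, x_next, abs(x - x_next))  # iterate and gap size
    x = x_next
```

Running this shows the iterates decreasing and the gaps widening at every step, which is consistent with the monotone-decrease idea, though of course it doesn't replace the proof.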

P.S. How do you guys put mathematical symbols into forum messages?

Thanks!