[note: also under discussion in s.o.s. math board]
But when you write the interval [a(n), a(n+1)] and apply the mean value theorem, it is implicitly assumed that a(n) < a(n+1), which is not true for our sequence, e.g. a(0)=0, a(1)=1, a(2)=0.54, a(3)=0.86, ...
How can we remove the assumption a(n)<a(n+1) while reaching the same conclusion that |a(n+2)-a(n+1)| ≤ 0.85|a(n+1)-a(n)|?
thanks.
I do leave holes on purpose, you know? Do you think I didn't realize this? You are supposed to do some of this yourself; it's to help you learn.
But I'll give you a hint: does it matter which term is bigger than the other? Clearly it does if we want to define an interval in the conventional way, but does it matter whether we use the interval I gave you or the reversed one? Is the result any different?
OK, I got it now.
Actually, I got confused earlier by the "for ALL n" part in "a(n+1) > a(n) for ALL n". I thought we needed a monotone sequence, but actually we don't.
Given any two consecutive terms a(n) and a(n+1), either a(n) <= a(n+1) or a(n) > a(n+1). In either case, applying the MVT to the interval between them (i.e. [min, max] of the two) gives |a(n+2) - a(n+1)| ≤ 0.85|a(n+1) - a(n)|. Done.
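The numerical values quoted above (a(0)=0, a(1)=1, a(2)≈0.54, a(3)≈0.86) are consistent with the iteration a(n+1) = cos(a(n)); assuming that recursion, here is a quick numerical sanity check of the contraction bound. The MVT gives |cos(x) - cos(y)| = |sin(c)| |x - y| for some c strictly between x and y, in whichever order, and since all terms lie in [0, 1] we have |sin(c)| ≤ sin(1) ≈ 0.841 < 0.85.

```python
import math

# Assumed recursion (consistent with a(0)=0, a(1)=1, a(2)≈0.54, a(3)≈0.86):
# a(n+1) = cos(a(n)), starting from a(0) = 0.
def iterate(a0, steps):
    seq = [a0]
    for _ in range(steps):
        seq.append(math.cos(seq[-1]))
    return seq

a = iterate(0.0, 20)

# Check the contraction |a(n+2) - a(n+1)| <= 0.85 |a(n+1) - a(n)| for every n.
# The order of a(n) and a(n+1) never matters: the MVT bound |sin(c)| <= sin(1)
# holds on the interval between them either way.
for n in range(len(a) - 2):
    assert abs(a[n + 2] - a[n + 1]) <= 0.85 * abs(a[n + 1] - a[n])

print(a[:4])  # first few terms match the values quoted in the thread
```

Note the check needs no monotonicity at all, which is exactly the point of the last post: the MVT is applied to the interval between two consecutive terms, whichever of them is larger.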