So what I want is f(t) > t, i.e. f(t) - t > 0.
Let g(t) = f(t) - t; then g(0) = 0.
By the Mean Value Theorem, since f(t) is continuous and differentiable, there is a c in (-infinity, 0) such that
f(0) - f(-infinity) = f'(c)(0 - (-infinity)).
And for any x1 and x2 in [-infinity, 0] with x1 < x2, there exists a c in (x1, x2) such that g(x2) - g(x1) = g'(c)(x2 - x1).
g'(x) = cos x - 1 < 0 for x in (-infinity, 0).
If g'(x) < 0 for all x in (-infinity, 0), then g is strictly decreasing.
Can you guide me more? I find myself a bit confused at times...
First, you shouldn't write "f(-infinity)" or take x1 = -infinity... this makes no sense (especially since f has no limit at -infinity).
I didn't take the title "Mean value theorem" into account when I wrote my previous post; I thought you could use the result "If g is differentiable on (-infinity, 0] and g'(x) < 0 for every x < 0, then g is strictly decreasing on (-infinity, 0]". This would give you g(x) > g(0) = 0 for x < 0, which is what you want.
Now, using the mean value theorem... If you apply it to g on [x, 0] where x < 0, you get g(0) - g(x) = g'(c)(0 - x) for some c in (x, 0), hence g(x) >= g(0) (since g'(c) <= 0 and 0 - x > 0). It doesn't give you the strict inequality though (you may a priori have g'(c) = 0). I don't know how you could get it easily from the mean value theorem.
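As a numerical sketch of this MVT step (my own Python illustration, not part of the thread; `g` is the thread's g(x) = sin x - x): the average slope of g over [x, 0], which the theorem says equals g'(c) for some c in (x, 0), comes out negative at sample points x < 0, matching the g(x) >= g(0) conclusion above.

```python
import math

def g(x):
    # The thread's function g(x) = sin(x) - x.
    return math.sin(x) - x

# Mean Value Theorem setup: the average slope of g over [x, 0]
# equals g'(c) for some c in (x, 0).  Check its sign numerically.
for x in (-0.5, -2.0, -10.0):
    slope = (g(0.0) - g(x)) / (0.0 - x)
    print(f"x = {x:6.1f}: mean slope = {slope:.4f}")  # negative each time
```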
Going with your method: you mention using "If g is differentiable on (-infinity, 0] and g'(x) < 0 for every x < 0, then g is strictly decreasing on (-infinity, 0]".
I would then start by saying: let g be differentiable on (-infinity, 0], with g'(x) = cos x - 1 < 0 for x in (-infinity, 0).
Then do you know how I should continue so as to really show that g'(x) < 0 your way?
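One way to see what still needs showing (a quick Python check of my own, assuming the thread's g(x) = sin x - x): g'(x) = cos x - 1 is <= 0 everywhere because cos x <= 1, but it is not strictly negative at every negative point, which is exactly the strictness gap mentioned above.

```python
import math

def g_prime(x):
    # Derivative of g(x) = sin(x) - x.
    return math.cos(x) - 1.0

# cos(x) - 1 <= 0 at every sample point, since cos(x) <= 1...
samples = [-0.001, -1.0, -math.pi, -5.0, -100.0]
print(all(g_prime(t) <= 0 for t in samples))  # True

# ...but equality occurs at negative multiples of 2*pi,
# so "g'(x) < 0 for all x < 0" is not literally true.
print(abs(g_prime(-2 * math.pi)) < 1e-12)  # True
```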
Define g(x) = sin x - x. Show g(x) > 0 for x < 0.
By algebra of continuity, since sin x and x are both continuous on R, g(x) = sin x - x is continuous on R.
Say g is continuous on [a, 0], where a < 0 is arbitrary.
g is differentiable on (a, 0). In fact g'(x) = cos x - 1.
Hence, by the Mean Value Theorem, there is some c in (a, 0) such that g(0) - g(a) = g'(c)(0 - a).
g'(c) = cos c - 1 < 0. (One point of care: cos c - 1 = 0 when c is a multiple of 2*pi, so this can fail for very negative a; but for a < -1 we have sin a >= -1 > a directly, so we may assume a is in [-1, 0), and then cos c - 1 < 0 for every c in (a, 0).)
Hence g(a) > g(0) (i.e. g is strictly decreasing),
since g'(c) < 0 and 0 - a > 0,
and g(0) = sin 0 - 0 = 0.
Now we have g(a) > 0.
Hence sin a - a > 0, i.e. sin a > a.
Therefore, writing x = a (with a < 0 arbitrary), we get sin x > x for all x < 0, as required.
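As a final sanity check (a small Python addition of mine, not part of the proof), the conclusion sin x > x can be spot-checked on a grid of negative points:

```python
import math

# Spot-check sin(x) > x on a grid of points in (-10, 0).
xs = [-10.0 + 0.1 * k for k in range(1, 100)]
ok = all(math.sin(x) > x for x in xs)
print(ok)  # True
```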