Originally Posted by **Glitch**

**The question:**

Use the mean value theorem to show that sin(t) < t whenever t > 0.

**My attempt:**

I took the function to be $f(t) = \sin(t)$, and the interval $[0, t]$.

$\displaystyle \frac{f(t) - f(0)}{t - 0} = f'(c) = \cos(c)$ for some $c \in (0, t)$

$\displaystyle \frac{\sin(t)}{t} = \cos(c)$
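For reference, the standard statement of the mean value theorem being applied here (not quoted from the original post) is:

```latex
% Mean value theorem: if f is continuous on [a, b] and differentiable
% on (a, b), then there exists some c in (a, b) such that
%     f'(c) = (f(b) - f(a)) / (b - a).
% In this problem a = 0, b = t, and f = sin, so f'(c) = cos(c).
\[
  \exists\, c \in (0, t):\qquad
  \frac{\sin(t) - \sin(0)}{t - 0} = \cos(c).
\]
```

Since $\sin(0) = 0$, this reduces to the equation above.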

Now, at this point I'm not sure how to deal with $\cos(c)$. Usually I'd try to find an upper bound for the RHS, and then turn the equation into an inequality (which then simplifies to the desired result). However, $\cos(c)$ oscillates between -1 and 1 on the interval $(0, t)$. I could cheat and look at the original question to see where I'm supposed to end up, but I'd like to find out the proper way of dealing with this.

I'm sure it's simple, but I'm just not seeing it. Any assistance would be great!