These functions ought to remind you of the functions called c** and s** that you have met before. That motivates the definition that follows.

Define

f(x) = c(x)^2 + s(x)^2.

Its derivative is

f'(x) = 2c(x)c'(x) + 2s(x)s'(x) = -2c(x)s(x) + 2s(x)c(x) = 0

(using the defining relations c' = -s and s' = c). So f is constant, and by evaluating it at 0 you see that the constant is 1. Therefore

c(x)^2 + s(x)^2 = 1.

This shows in particular that |c(x)| <= 1 for all x.
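If you want to convince yourself numerically, here is a minimal sketch. It assumes (as the c**/s** hint suggests, though the problem does not say so outright) that c and s behave like the familiar cosine and sine:

```python
import math

# Hypothetical stand-ins for c and s: the hint suggests they behave like
# cosine and sine (this identification is an assumption, not given).
c, s = math.cos, math.sin

def f(x):
    # f(x) = c(x)^2 + s(x)^2, which the derivative argument shows is constant
    return c(x)**2 + s(x)**2

# f should equal f(0) = 1 everywhere, so |c(x)| <= 1 follows.
for x in [-3.0, -1.0, 0.0, 0.5, 2.0, 10.0]:
    assert abs(f(x) - 1.0) < 1e-12
```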

Now you can apply Taylor's Theorem with Lagrange's form of the remainder (taking the first two terms plus remainder) to see that

c(x) = 1 - c(ξ)x^2/2

for some ξ between 0 and x, and this must be >= 1 - x^2/2 (because c(ξ) <= 1). Now do the same, taking four terms of the Taylor series, plus remainder, to get the other inequality c(x) <= 1 - x^2/2 + x^4/24.
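The two bounds can be sanity-checked on a grid of points. Again this is a sketch that assumes c is the cosine (an assumption based on the hint, not on the problem statement):

```python
import math

# Assumed stand-in for c (the hint suggests it behaves like cosine).
c = math.cos

def lower(x):
    # bound from the two-term Taylor expansion with Lagrange remainder
    return 1 - x**2 / 2

def upper(x):
    # bound from the four-term expansion with Lagrange remainder
    return 1 - x**2 / 2 + x**4 / 24

# Both bounds should hold for every x, because the remainder involves
# a value of c, which lies between -1 and 1.
for i in range(201):
    x = -5 + i * 0.05
    assert lower(x) <= c(x) <= upper(x)
```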

Originally Posted by **pearlyc**
(b) Deduce that the function c has no zeros in the interval [0,1.4] but at least one zero in the interval [1.4,1.6]. Let γ denote the smallest such zero. Prove that

This follows easily from the inequalities in part (a), together with the intermediate value theorem and the magic formula c(x)^2 + s(x)^2 = 1. The lower bound gives c(x) >= 1 - x^2/2 >= 1 - 1.4^2/2 = 0.02 > 0 for 0 <= x <= 1.4, while the upper bound gives c(1.6) <= 1 - 1.6^2/2 + 1.6^4/24 < 0, so the intermediate value theorem produces a zero between 1.4 and 1.6.
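To see where that smallest zero actually sits, a bisection on [1.4, 1.6] pins it down. The code below again assumes c is the cosine, which the problem only hints at; under that assumption the zero is π/2:

```python
import math

c = math.cos  # assumed stand-in for c, per the hint

# Part (a)'s bounds give c(1.4) > 0 and c(1.6) < 0, so bisect on [1.4, 1.6].
lo, hi = 1.4, 1.6
assert c(lo) > 0 and c(hi) < 0
for _ in range(60):
    mid = (lo + hi) / 2
    if c(mid) > 0:
        lo = mid
    else:
        hi = mid

gamma = (lo + hi) / 2  # smallest zero of c in [1.4, 1.6]
assert abs(gamma - math.pi / 2) < 1e-12
```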

Originally Posted by **pearlyc**
...

and

I don't see an easy way to get these results. You want to show that

c(x+y) = c(x)c(y) - s(x)s(y)

and

s(x+y) = s(x)c(y) + c(x)s(y).

The only way I can see to do this is to show that (for fixed y) c(x+y) is the sum of its Taylor series in powers of x, and then to show that the function c(x)c(y) - s(x)s(y) is the sum of the same Taylor series.
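The compare-the-Taylor-series route can at least be checked numerically. This is a minimal sketch under two assumptions of mine: that c and s are the usual power series defining cosine and sine, and that the results in question are the addition formulas:

```python
import math

def c(x, terms=25):
    # partial sum of the (assumed) defining series: sum (-1)^k x^(2k) / (2k)!
    return sum((-1)**k * x**(2*k) / math.factorial(2*k) for k in range(terms))

def s(x, terms=25):
    # partial sum of the (assumed) series: sum (-1)^k x^(2k+1) / (2k+1)!
    return sum((-1)**k * x**(2*k+1) / math.factorial(2*k+1) for k in range(terms))

# If both sides really are sums of the same Taylor series in x (for fixed y),
# they must agree numerically at any sample points.
for x, y in [(0.3, 0.4), (1.1, -0.7), (2.0, 1.5)]:
    assert abs(c(x + y) - (c(x)*c(y) - s(x)*s(y))) < 1e-9
    assert abs(s(x + y) - (s(x)*c(y) + c(x)*s(y))) < 1e-9
```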