The question goes like this.

Suppose that two real-valued functions C and S are defined and differentiable on the domain $\displaystyle (-\infty,\infty)$ and satisfy the following properties:

$\displaystyle S'(\theta) = C(\theta), \quad C'(\theta) = -S(\theta), \quad S(0) = 0, \quad C(0) = 1.$

(a) Use Taylor's Theorem with Lagrange's form of the remainder to prove that for all $\displaystyle \theta \in \mathbb{R}$,

$\displaystyle 1 - \frac{\theta^2}{2} \leq C(\theta) \leq 1 - \frac{\theta^2}{2} + \frac{\theta^4}{24}.$
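By the uniqueness theorem for this initial value problem, the pair $(S, C)$ must be $(\sin, \cos)$, so the bound in (a) can be sanity-checked numerically with `math.cos` standing in for $C$. This is only an illustrative check on a sample grid, not a proof; the grid range $[-4, 4]$ and step size are arbitrary choices.

```python
import math

# Two-sided Taylor bound from part (a); cos stands in for C, since the
# unique solution of S' = C, C' = -S, S(0) = 0, C(0) = 1 is (sin, cos).
def lower(t):
    return 1 - t**2 / 2

def upper(t):
    return 1 - t**2 / 2 + t**4 / 24

# Check the bound on a grid over [-4, 4]; a small tolerance absorbs
# floating-point error at points of equality such as t = 0.
for i in range(-400, 401):
    t = i / 100
    assert lower(t) - 1e-12 <= math.cos(t) <= upper(t) + 1e-12
```

Note that the lower bound is informative only near the origin: for large $|\theta|$ it is very negative while $C$ stays bounded, which is why part (b) only needs it on $[0, 1.6]$.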

(b) Deduce that the function $\displaystyle C(\theta)$ has no zeros in the interval $[0,1.4]$ but at least one zero in the interval $[1.4,1.6]$. Let $\displaystyle \lambda$ denote the smallest such zero. Prove that $\displaystyle S(\lambda) = 1$, $\displaystyle C(\theta + 2\lambda) = -C(\theta)$, and $\displaystyle S(\theta + 2\lambda) = -S(\theta)$.
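Again identifying $(S, C)$ with $(\sin, \cos)$ via uniqueness, the claims in (b) can be checked numerically: $C > 0$ on $[0, 1.4]$, $C$ changes sign on $[1.4, 1.6]$ (so the Intermediate Value Theorem gives a zero there), and the identities hold with $\lambda = \pi/2$. The grid of test points is an arbitrary illustration choice.

```python
import math

# C = cos has no zero on [0, 1.4]: it stays strictly positive there.
assert all(math.cos(i / 100) > 0 for i in range(0, 141))

# Sign change on [1.4, 1.6] forces a zero by the Intermediate Value Theorem.
assert math.cos(1.4) > 0 > math.cos(1.6)

# The smallest such zero is lambda = pi/2, and S(lambda) = 1.
lam = math.pi / 2
assert 1.4 <= lam <= 1.6
assert abs(math.sin(lam) - 1) < 1e-12

# Shift identities C(t + 2*lam) = -C(t) and S(t + 2*lam) = -S(t),
# checked on a grid over [-3, 3].
for i in range(-300, 301):
    t = i / 100
    assert abs(math.cos(t + 2 * lam) + math.cos(t)) < 1e-12
    assert abs(math.sin(t + 2 * lam) + math.sin(t)) < 1e-12
```

For the proof itself, the key facts are $S^2 + C^2 \equiv 1$ (its derivative vanishes and it equals $1$ at $0$), which together with $C(\lambda) = 0$ and $S$ increasing on $[0, \lambda]$ gives $S(\lambda) = 1$; the shift identities then follow from uniqueness applied to the pair $\bigl(-S(\theta + 2\lambda),\, -C(\theta + 2\lambda)\bigr)$.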