# Thread: Taylor's theorem

1. ## Taylor's theorem

Suppose that $f(x)=\ln (1+x)$

1) Express $f(x)$ in the form $p_1(x)+R_2(x)$, where $p_1$ is the first Taylor polynomial for $f$ about 0 and $R_2$ is the Lagrange formula for the remainder.

$f(x)=p_1(x)+R_2(x)$

$p_1(x)=f(0)+f'(0)x$

$f(0)=\ln 1=0,\qquad f'(x)=\frac{1}{1+x}\Rightarrow f'(0)=1$

$\Longrightarrow p_1(x)=x$

$R_2(x)=\frac{f''(c)}{2!}x^2$

$f''(x)=-\frac{1}{(1+x)^2}\Rightarrow f''(c)=-\frac{1}{(1+c)^2}$

$f(x)=x-\frac{x^2}{2(1+c)^2}$
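As a quick numerical sanity check of this decomposition: for each $x$, the identity $\ln(1+x) = x - \frac{x^2}{2(1+c)^2}$ can be solved for $c$, and Lagrange's theorem says that $c$ should land strictly between $0$ and $x$. A short sketch (the variable names here are my own, not from the post):

```python
import math

# Solve x - ln(1+x) = x^2 / (2(1+c)^2) for c and confirm 0 < |c| < |x|.
for x in (0.1, -0.1, 0.05):
    remainder = x - math.log(1 + x)              # equals x^2 / (2(1+c)^2), positive for x != 0
    c = math.sqrt(x * x / (2 * remainder)) - 1   # recover c from the remainder identity
    assert min(0, x) < c < max(0, x), (x, c)     # c lies strictly between 0 and x
    print(f"x = {x:+.2f}  ->  c = {c:+.5f}")
```

For $x = 0.1$ this gives $c \approx 0.0325$, comfortably inside $(0, 0.1)$.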

2) Suppose that $x\in [-0.1, 0.1]$ and consider the approximation $\ln (1+x) \approx x$. Use the answer in the first part to show that an upper bound for the absolute error in this approximation is $\frac{1}{162}$.

$|Error|=|f(x)-p_1(x)|$

$=|R_2(x)|$

How would I do this question?

2. Originally Posted by acevipa
How would I do this question?
Well, you have already determined that $R_2(x)= -\frac{x^2}{2(1+ c)^2}$ for some $c$ between $-0.1$ and $0.1$, so that $|R_2(x)|= \frac{x^2}{2(1+ c)^2}$. Now what is the largest possible value for that? Remember that a fraction achieves its maximum value when the numerator is as large as possible and the denominator is as small as possible.
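Spelling out that hint: on $[-0.1, 0.1]$ the numerator is largest when $|x| = 0.1$, and the denominator is smallest when $c$ is as close to $-0.1$ as possible, i.e. $1+c > 0.9$. A sketch of the final bound:

```latex
% Maximize the numerator and minimize the denominator on [-0.1, 0.1]:
%   |x| \le 0.1  =>  x^2 \le 0.01
%   |c| <  0.1  =>  (1+c)^2 > (0.9)^2 = 0.81
\[
  |R_2(x)| = \frac{x^2}{2(1+c)^2}
           \le \frac{(0.1)^2}{2(0.9)^2}
           = \frac{0.01}{1.62}
           = \frac{1}{162}.
\]
```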