Originally Posted by **acevipa**

Suppose that $\displaystyle f(x)=\ln (1+x)$

1) Express $\displaystyle f(x)$ in the form $\displaystyle p_1(x)+R_2(x)$, where $\displaystyle p_1$ is the first Taylor polynomial for $\displaystyle f$ about 0 and $\displaystyle R_2$ is the Lagrange formula for the remainder.

$\displaystyle f(x)=p_1(x)+R_2(x)$

$\displaystyle p_1(x)=f(0)+f'(0)x$

$\displaystyle \Longrightarrow p_1(x)=x$

$\displaystyle R_2(x)=\frac{f''(c)}{2!}x^2$

$\displaystyle f''(x)=-\frac{1}{(1+x)^2}\Rightarrow f''(c)=-\frac{1}{(1+c)^2}$

$\displaystyle f(x)=x-\frac{x^2}{2(1+c)^2}$ for some $\displaystyle c$ strictly between $\displaystyle 0$ and $\displaystyle x$.
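As a quick sanity check on this decomposition (my own check, not part of the original question), you can solve $\displaystyle \ln(1+x)=x-\frac{x^2}{2(1+c)^2}$ for $c$ at a sample $x>0$ and confirm the resulting $c$ really lies in $(0,x)$, as the Lagrange form promises:

```python
import math

# For f(x) = ln(1+x), the Lagrange form gives
#   ln(1+x) = x - x^2 / (2(1+c)^2)   for some c between 0 and x.
# Invert that equation for c at a sample point and check c is in (0, x).
x = 0.1
remainder = math.log(1 + x) - x               # R_2(x) = f(x) - p_1(x); negative here
c = math.sqrt(x**2 / (-2 * remainder)) - 1    # solve R_2(x) = -x^2 / (2(1+c)^2) for c
print(c)                                      # c is roughly 0.033, inside (0, 0.1)
assert 0 < c < x
```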

2) Suppose that $\displaystyle x\in [-0.1, 0.1]$ and consider the approximation $\displaystyle \ln (1+x) \approx x$. Use the answer in the first part to show that an upper bound for the absolute error in this approximation is $\displaystyle \frac{1}{162}$.

$\displaystyle |\text{Error}|=|f(x)-p_1(x)|$

$\displaystyle =|R_2(x)|=\frac{x^2}{2(1+c)^2}$
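This is not a proof, but a quick numerical check (my addition) confirms the target bound is plausible: the actual error $|\ln(1+x)-x|$ over $[-0.1,0.1]$ stays below $\frac{(0.1)^2}{2(0.9)^2}=\frac{1}{162}$, which is what you get by maximising $x^2$ at $|x|=0.1$ and minimising $(1+c)^2$ at $c=-0.1$:

```python
import math

# Numerically compare the worst-case error of ln(1+x) ~ x on [-0.1, 0.1]
# against the claimed upper bound (0.1)^2 / (2 * 0.9^2) = 1/162.
bound = 1 / 162
max_err = max(abs(math.log(1 + x) - x)
              for x in (i / 10000 for i in range(-1000, 1001)))
print(max_err, bound)
assert max_err <= bound
```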

How would I do this question?