Use a Taylor polynomial to approximate $\displaystyle f(x)=\frac{1}{1+x}$ at $\displaystyle x=0.1$ with an error of less than 0.001.
The n-th derivative of f is $\displaystyle f^{(n)}(x)=\frac{(-1)^n n!}{(1+x)^{n+1}}$. If you expand f around $\displaystyle x_0=0$ with the Lagrange form of the remainder, you get
$\displaystyle f(x) = 1-x+x^2-\cdots+(-1)^{n-1}x^{n-1}+\frac{(-1)^n x^n}{(1+\theta x)^{n+1}}$
where $\displaystyle \theta \in (0,1)$ and the last term is the Lagrange remainder. To make the remainder for $\displaystyle x=0.1$ smaller than 0.001, we bound it by the worst case $\displaystyle \theta=0$ and solve for n:
$\displaystyle \left|\frac{(-1)^n\, 0.1^n}{(1+\theta\cdot 0.1)^{n+1}}\right| \le \frac{0.1^n}{(1+0\cdot 0.1)^{n+1}} = 0.1^n$. Now $\displaystyle 0.1^n=0.001$ exactly at $\displaystyle n=3$, and since in fact $\displaystyle \theta>0$, the actual remainder for $\displaystyle n=3$ is strictly less than 0.001. Thus we take $\displaystyle n=3$ and get:
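As a quick sanity check (my addition, not part of the original argument), the worst-case bound $\displaystyle 0.1^n$ can be tabulated in exact rational arithmetic, which avoids the floating-point quirk that `0.1**3` is not exactly `0.001` in Python:

```python
from fractions import Fraction

# Worst-case remainder bound 0.1**n, i.e. the remainder term with theta = 0.
for n in range(1, 5):
    bound = Fraction(1, 10) ** n
    # Show n, the bound as a decimal, and whether it meets the 0.001 tolerance.
    print(n, float(bound), bound <= Fraction(1, 1000))
```

The bound first reaches the tolerance at n = 3, where it equals 0.001 exactly.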
$\displaystyle f(0.1) \approx 1-0.1+0.1^2=0.91$. Compare this with the exact value of $\displaystyle f(0.1)=\frac{1}{1+0.1}=0.\overline{90}$: the resulting error $\displaystyle 0.91-0.\overline{90}=0.000\overline{90}$ is less than 0.001, as required.
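The final arithmetic can also be verified exactly with rationals (again a small sketch of my own, not part of the original solution):

```python
from fractions import Fraction

# Degree-2 Taylor polynomial of 1/(1+x) around 0, evaluated at x = 1/10.
x = Fraction(1, 10)
approx = 1 - x + x**2        # 91/100  = 0.91
exact = 1 / (1 + x)          # 10/11   = 0.909090...
error = approx - exact       # 1/1100  = 0.000909...
print(approx, exact, error)
assert error < Fraction(1, 1000)  # below the 0.001 tolerance
```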