Could anyone help me out with this? Thanks!
Use Taylor's approximation theorem to estimate $e$ with error less than $10^{-6}$.
Hi
Let $\displaystyle f:x\mapsto \exp x$. $\displaystyle f\in\mathcal{C}^{\infty}(\mathbb{R})$ hence for any integer $\displaystyle n$ and for all $\displaystyle a,\,b\in \mathbb{R}$, Taylor's theorem states that
$\displaystyle f(b)=\sum_{k=0}^n\frac{f^{(k)}(a)}{k!}(b-a)^k+R_n$ with $\displaystyle R_n=\frac{f^{(n+1)}(c)}{(n+1)!}(b-a)^{n+1}$ for some $\displaystyle c\in[a,b]$.
As we want to evaluate $\displaystyle \exp 1$ we choose $\displaystyle b=1$. $\displaystyle a$ can be any real number, let's choose $\displaystyle a=0$, it'll make things easier. For any integer $\displaystyle k$ and for any real number $\displaystyle x$, $\displaystyle f^{(k)}(x)=\exp x$ hence the previous equality becomes
$\displaystyle \exp 1=\sum_{k=0}^n\frac{\exp 0}{k!}(1-0)^k+\frac{\exp c}{(n+1)!}(1-0)^{n+1}=\underbrace{\sum_{k=0}^n\frac{1}{k!}}_{\text{approximation}}+\underbrace{\frac{\exp c}{(n+1)!}}_{\text{error}}$ for some $\displaystyle c\in[0,1]$.
Since we want an error $\displaystyle <10^{-6}$, you have to find $\displaystyle n$ such that $\displaystyle \frac{\exp c}{(n+1)!} <10^{-6}$. (Hint: since $\displaystyle c\in[0,1]$, you can use the crude bound $\displaystyle \exp c\le e<3$.) Can you take it from here?
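Once you've found your $n$, you can sanity-check it numerically. Here's a short Python sketch; note that replacing $\exp c$ with the crude upper bound $3$ is my own simplification (valid since $c\in[0,1]$ gives $\exp c\le e<3$), not part of the theorem itself:

```python
import math

# Find the smallest n with 3 / (n+1)! < 1e-6, which guarantees
# exp(c) / (n+1)! < 1e-6 for every c in [0, 1].
n = 0
while 3 / math.factorial(n + 1) >= 1e-6:
    n += 1
print("n =", n)

# Partial sum of 1/k! for k = 0..n, and the actual error vs. math.e.
approx = sum(1 / math.factorial(k) for k in range(n + 1))
print("approximation:", approx)
print("actual error:", abs(math.e - approx))
```

The actual error printed should come out comfortably below $10^{-6}$, since the remainder bound is not tight.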