Originally Posted by **chainrule**

I have the function $\displaystyle f(x) = \frac{1}{\sqrt{1-x}}$, defined for $\displaystyle x \in \; ]-1,1[$ and centered at $\displaystyle a = 0$. The Taylor polynomial for this is $\displaystyle T_{n}(x) = \sum_{k=0}^n \frac {f^{(k)}(0)x^k}{k!}$.
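For what it's worth, the derivatives work out to a closed form (easy to check for $k = 1, 2$, and then by induction):

$\displaystyle f^{(k)}(x) = \frac{(2k)!}{4^{k}\,k!}\,(1-x)^{-\frac{2k+1}{2}}, \qquad \text{so} \qquad \frac{f^{(k)}(0)}{k!} = \frac{1}{4^{k}}\binom{2k}{k} \quad \text{and} \quad T_{n}(x) = \sum_{k=0}^{n} \binom{2k}{k}\,\frac{x^{k}}{4^{k}}.$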

Now, for every $\displaystyle x \in \; ]0,1[$ there is a $\displaystyle \xi = \xi(x) \in \; ]0,x[$ such that the error (the Cauchy form of the remainder) is $\displaystyle R_{n}(x) = \frac {xf^{(n+1)}(\xi)(x-\xi)^n}{n!}$.

Using this, how do I prove that the Taylor polynomials $\displaystyle T_{n}$ converge uniformly on $\displaystyle [0,x]$?

By doing a ratio test I get $\displaystyle \left|{\frac {R_{n+1}(x)}{R_{n}(x)}}\right| = \frac{2n+3}{2n+2}$, which is greater than $1$ for every $n$ (and only tends to $1$ as $n\rightarrow\infty$). How is this supposed to converge? Can somebody please explain this?
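For what it's worth, here is a quick numerical sanity check in Python. It assumes the closed form $\displaystyle \frac{f^{(k)}(0)}{k!} = \frac{1}{4^k}\binom{2k}{k}$ (which one can verify for small $k$), and just evaluates $T_n$ at $x = 0.5$ for increasing $n$:

```python
from math import comb, sqrt

def T(n, x):
    # Taylor polynomial of f(x) = 1/sqrt(1 - x) at a = 0,
    # using the coefficients f^(k)(0)/k! = C(2k, k) / 4^k
    return sum(comb(2 * k, k) / 4**k * x**k for k in range(n + 1))

x = 0.5
f = 1 / sqrt(1 - x)
for n in (5, 10, 20, 40):
    print(n, abs(f - T(n, x)))  # error shrinks as n grows
```

The error keeps shrinking as $n$ grows, so the polynomials do seem to converge at $x = 0.5$; that suggests the problem is with how the ratio test is being applied to $R_n$, not with the convergence itself.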