# Math Help - Taylor series convergence

1. ## Taylor series convergence

I have a function $f(x) = \frac{1}{\sqrt{1-x}}$ defined for $x$ in
$]-1,1[$ and centered at $a = 0$. The Taylor polynomial for this would be $T_{n}(x) = \sum_{k=0}^n \frac {f^{(k)}(0)x^k}{k!}$.
Now for every $x \in \left]0,1\right[$, there is $\xi = \xi(x) \in \left]0,x\right[$ such that the error is $R_{n}(x) = \frac {xf^{(n+1)}(\xi)(x-\xi)^n}{n!}$.
Using this, how do I prove that the Taylor polynomials $T_{n}$ converge uniformly on $[0,x]$?
Applying the ratio test I get $\left|\frac {R_{n+1}(x)}{R_{n}(x)}\right| = \frac{2n+3}{2n+2}$. How is this supposed to converge as $n \rightarrow \infty$? Can somebody please explain this?

2. Originally Posted by chainrule
I have a function $f(x) = \frac{1}{\sqrt{1-x}}$ defined for $x$ in
$]-1,1[$ and centered at $a = 0$. The Taylor polynomial for this would be $T_{n}(x) = \sum_{k=0}^n \frac {f^{(k)}(0)x^k}{k!}$.
Now for every $x \in \left]0,1\right[$, there is $\xi = \xi(x) \in \left]0,x\right[$ such that the error is $R_{n}(x) = \frac {xf^{(n+1)}(\xi)(x-\xi)^n}{n!}$.
Using this, how do I prove that the Taylor polynomials $T_{n}$ converge uniformly on $[0,x]$?
Applying the ratio test I get $\left|\frac {R_{n+1}(x)}{R_{n}(x)}\right| = \frac{2n+3}{2n+2}$. How is this supposed to converge as $n \rightarrow \infty$? Can somebody please explain this?
I am sure you learned this long ago (perhaps too long ago!). Divide both the numerator and the denominator by $n$ to get $\frac{2+ \frac{3}{n}}{2+ \frac{2}{n}}$. Now what is the limit of that as $n$ goes to infinity?

3. I am sure you learned this long ago (perhaps too long ago!). Divide both the numerator and the denominator by $n$ to get $\frac{2+ \frac{3}{n}}{2+ \frac{2}{n}}$. Now what is the limit of that as $n$ goes to infinity?
Yes, but this limit gives me a value of 1, whereas the ratio test would require the limit to be strictly less than 1 for convergence. Also this series is supposed to converge uniformly over $\xi \in \left]0,x\right[$ for every $x \in \left]0,1\right[$. How is that possible?
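For what it's worth, here is a short numerical sketch (my own, not from the thread) of why the partial sums can still converge even though the coefficient ratio tends to 1: each term of the series also carries a factor $x^k$ with $x < 1$, so the terms still shrink geometrically. The coefficient recurrence $c_{k+1} = c_k \, \frac{2k+1}{2k+2}$, $c_0 = 1$, comes from differentiating $f(x) = \frac{1}{\sqrt{1-x}}$ repeatedly at $0$.

```python
import math

def taylor_partial_sum(x, n):
    """Partial sum T_n(x) of the Maclaurin series of 1/sqrt(1-x).

    The coefficients c_k = f^(k)(0)/k! satisfy the recurrence
    c_{k+1} = c_k * (2k+1)/(2k+2) with c_0 = 1.
    """
    c, s = 1.0, 0.0
    for k in range(n + 1):
        s += c * x**k          # add the k-th term c_k * x^k
        c *= (2*k + 1) / (2*k + 2)  # advance to the next coefficient
    return s

# The coefficient ratio tends to 1, yet for x = 0.5 the error
# |T_n(x) - f(x)| still shrinks rapidly as n grows.
x = 0.5
exact = 1 / math.sqrt(1 - x)
for n in (5, 20, 80):
    print(n, abs(taylor_partial_sum(x, n) - exact))
```

So a coefficient ratio approaching 1 only means the radius of convergence is 1; inside that radius (and in particular on $[0,x]$ for any fixed $x < 1$) the terms are dominated by a convergent geometric series, which is what gives uniform convergence there.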