I have a function defined for x in
and centered at a = 0. The Taylor polynomial for this would be .
Now for every , there is such that the error would be .
Using this, how do I prove that the Taylor polynomials converge uniformly over [0,x]?
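The specific function and remainder formula are missing above, but the standard argument runs roughly as follows (a sketch, assuming the Lagrange form of the remainder; the symbols f, P_n, R_n, c, and M_n here are generic placeholders, not taken from the original formulas):

```latex
\[
  f(t) - P_n(t) \;=\; R_n(t) \;=\; \frac{f^{(n+1)}(c)}{(n+1)!}\, t^{\,n+1},
  \qquad c \text{ between } 0 \text{ and } t .
\]
\[
  \text{If } \sup_{t \in [0,\,x]} |R_n(t)| \le M_n
  \text{ with } M_n \to 0 \text{ as } n \to \infty,
  \text{ then } \sup_{t \in [0,\,x]} |f(t) - P_n(t)| \to 0 .
\]
```

The last line is exactly uniform convergence of the Taylor polynomials P_n to f on [0, x]: the key step is to bound the remainder uniformly in t and show that bound tends to 0.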
By doing a ratio test I get . How is this supposed to converge? Can somebody please explain this?
Yes, but this limit would give me a value of 1, whereas convergence would mean that the limit has to tend to 0 as n tends to infinity. Also, this series is supposed to converge uniformly over for every . How is that possible?

I am sure you learned this long ago, perhaps too long ago! Divide both numerator and denominator by n to get . Now what is the limit of that as n goes to infinity?
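To illustrate the hint numerically (a minimal sketch; the ratio n/(n+1) is inferred from the reply above, and dividing both numerator and denominator by n rewrites it as 1/(1 + 1/n)):

```python
# Dividing numerator and denominator of n/(n+1) by n gives 1/(1 + 1/n).
# As n grows, 1/n -> 0, so the ratio tends to 1 (not 0).
for n in [10, 100, 10_000, 1_000_000]:
    direct = n / (n + 1)
    rewritten = 1 / (1 + 1 / n)
    print(f"n={n:>9}: n/(n+1)={direct:.8f}, 1/(1+1/n)={rewritten:.8f}")
```

Two points worth separating here: in the ratio test, convergence requires the limit to be strictly less than 1 (not 0), and a limit equal to 1 is inconclusive; what must tend to 0 is the remainder term, not the ratio, for the Taylor polynomials to converge to the function.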