I have a function $f$ defined for $x$ in an interval $I$ containing $0$, and centered at $a = 0$. The Taylor polynomial of degree $n$ for this would be
$$P_n(x) = \sum_{k=0}^{n} \frac{f^{(k)}(0)}{k!}\, x^k.$$

Now for every $x \in I$, there is a $c$ between $0$ and $x$ such that the error would be
$$R_n(x) = f(x) - P_n(x) = \frac{f^{(n+1)}(c)}{(n+1)!}\, x^{n+1}.$$

Using this, how do I prove that the Taylor polynomials converge uniformly over [0,x]?
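Here is how far I got with a sup-norm bound (this sketch makes the extra assumption that the derivatives of $f$ are uniformly bounded on $[0,x]$, say $|f^{(n+1)}(t)| \le M$ for all $n$ and all $t \in [0,x]$, which holds for functions like $e^t$, $\sin t$, $\cos t$):

```latex
% Sup-norm bound over [0, x], using the Lagrange form of the remainder.
% Assumption: |f^{(n+1)}(t)| <= M for all n and all t in [0, x].
\sup_{t \in [0,x]} \bigl| f(t) - P_n(t) \bigr|
  = \sup_{t \in [0,x]} \left| \frac{f^{(n+1)}(c_t)}{(n+1)!}\, t^{n+1} \right|
  \le \frac{M\, x^{n+1}}{(n+1)!} \longrightarrow 0
  \quad (n \to \infty).
```

Since the bound $M x^{n+1}/(n+1)!$ does not depend on $t$, its convergence to $0$ would give uniform convergence on $[0,x]$, but I am not sure this is the intended route.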

By doing a ratio test on the terms I get a limit that I can't see is less than $1$. How is this supposed to converge? Can somebody please explain this?
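As a numerical sanity check (purely illustrative, with the hypothetical concrete choice $f(x) = e^x$, which is not necessarily the function in question), the worst-case error of the Taylor polynomials over $[0, x]$ can be computed directly and does shrink toward $0$ as the degree grows:

```python
import math

# Hypothetical example: f(t) = e^t, centered at a = 0.
# Degree-n Taylor polynomial: P_n(t) = sum_{k=0}^{n} t^k / k!.
def taylor_exp(t, n):
    return sum(t**k / math.factorial(k) for k in range(n + 1))

x = 2.0
ts = [x * i / 100 for i in range(101)]  # grid on [0, x]

# Sup-norm error on [0, x]; by the Lagrange bound it is at most
# e^x * x^(n+1) / (n+1)!, which tends to 0 as n grows.
for n in (2, 5, 10, 15):
    sup_err = max(abs(math.exp(t) - taylor_exp(t, n)) for t in ts)
    print(n, sup_err)
```

The ratio of successive bound terms is $x/(n+2)$, which tends to $0 < 1$ for any fixed $x$, so the factorial in the denominator eventually beats the power of $x$.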