Prove Taylor series of sqrt(1-x) converges for (-1, 0]

Write the Taylor series for sqrt(1-x), centered at 0. Prove that the series converges to sqrt(1-x) in (-1, 0].

My attempt:

By computing derivatives of sqrt(1-x) at x = 0, I wrote the Taylor series as

Sigma(n=0 to infinity) - (1/2^n) x^n

This is of the form Sigma(n=0 to infinity) an x^n, where an = -(1/2^n)

Therefore it is a power series centered at 0

Let R = radius of convergence of the series.

1/R = lim(n->infinity) |an|^(1/n) = lim(n->infinity) (1/2^n)^(1/n) = 1/2

Or, R = 2

Therefore the series converges if x is in (-R, R) = (-2, 2)

(-1, 0] is a subset of (-2, 2)

Therefore the series converges if x is in (-1, 0]

So it is proved that it converges. But how to prove that the series Sigma(n=0 to infinity) -(1/2^n) x^n converges to sqrt(1-x) ?
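Actually, a quick numerical experiment (a Python sketch; the helper name is just for illustration) makes me doubt the series itself: the partial sums settle near a value that is not sqrt(1-x).

```python
import math

def claimed_partial_sum(x, terms=50):
    """Partial sum of the posted series: Sigma(n=0 to infinity) -(1/2^n) x^n."""
    return sum(-(x / 2.0) ** n for n in range(terms))

x = -0.5
print(claimed_partial_sum(x))   # about -0.8
print(math.sqrt(1 - x))         # about 1.2247
```

The two values disagree, so even though the series converges on (-1, 0], it cannot be converging to sqrt(1-x) there.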

Re: Prove Taylor series of sqrt(1-x) converges for (-1, 0]

Hey visharad.

There should be a result saying that the Taylor series of a function is unique, and thus, where it converges, it is the same as the function it represents.

The alternative is to use the epsilon-delta definition or results from Hilbert space theory (regarding infinite series). You can show, for example, that the sequence of partial sums is Cauchy and from there link it to other similar results.

I think the first suggestion should be enough.

Re: Prove Taylor series of sqrt(1-x) converges for (-1, 0]

Thank you

I just now realized that the series is actually

1 + Sigma(n=1 to infinity) -x^n/(2^n n!) -----(A)

= 1 + Sigma(n=1 to infinity) an x^n

where an = -1/(2^n n!)

an+1/an = 1/(2(n+1))

Or |an+1|/|an| = 1/(2(n+1))

1/R = lim(n->infinity) |an+1|/|an| = 0

Therefore R = infinity

Therefore the series converges for all values of x. Therefore it also converges if x is in (-1, 0]

But suppose the problem only gives the series in equation (A). How to prove that it converges to sqrt(1-x) in (-1, 0]?

Re: Prove Taylor series of sqrt(1-x) converges for (-1, 0]

Hey visharad, I think you are going in the wrong direction here. Firstly, the Taylor series you have written for the function is wrong. Check again. I would suggest that once you work it out by hand, you check it on Wolfram Alpha: just type "taylor series of sqrt(1 - x)" there and you will get it. Also, the Taylor series of sqrt(1-x) centered at x = 0 does not converge to sqrt(1-x) everywhere, as you claim! The best way to approach this is to use the idea of analyticity.
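For instance, here is a quick Python sketch you could use instead of Wolfram Alpha (the helper names are made up): the correct expansion is the binomial series sqrt(1-x) = Sigma(n=0 to infinity) C(1/2, n) (-x)^n, and its partial sums can be compared directly against math.sqrt.

```python
import math

def binom_half(n):
    """Generalized binomial coefficient C(1/2, n) = (1/2)(1/2-1)...(1/2-n+1)/n!."""
    c = 1.0
    for k in range(n):
        c *= (0.5 - k) / (k + 1)
    return c

def sqrt_series(x, terms=200):
    """Partial sum of the binomial series for sqrt(1 - x) centered at 0."""
    return sum(binom_half(n) * (-x) ** n for n in range(terms))

for x in (-0.9, -0.5, 0.0):
    print(x, sqrt_series(x), math.sqrt(1 - x))  # the two columns agree closely
```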

A function is analytic in an open disc centered at b (in this case an open interval) if and only if its Taylor series converges to the value of the function at each point of the disc.

Do you see the if and only if statement there? It means that you can use one to prove/disprove the other. So now look at the point x = 1. The function is not analytic at that point. So if |x| > 1, any open interval around 0 containing x also contains the non-analytic point x = 1, which implies that the Taylor series expanded about 0 does not converge to the value of sqrt(1-x) at that x. But for |x| < 1, the function is analytic in the open interval (-y, y) (where |x| < y < 1), and so at every point of this interval the series converges to the function value.

So the answer is that the series converges to the function value if |x| < 1, and hence it converges to sqrt(1-x) in (-1, 0]. Note here that a Taylor series of the function expanded about 5 would converge to the function value in (1, 9).

Re: Prove Taylor series of sqrt(1-x) converges for (-1, 0]

Quote: Originally Posted by **twizter**

Hey visharad, I think you are going in the wrong direction here. Firstly, the Taylor series you have written for the function is wrong. Check again.

Yes, I checked again, realised my mistake, and got the correct series. The series for sqrt(1-x) has nth term = (1/2, n)/n! * (-x)^n, where (1/2, n) = (1/2)(1/2 - 1)...(1/2 - n + 1) (n factors, for n = 1, 2, ...) and (1/2, 0) = 1.
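Just to double-check that formula, the coefficients can be computed in exact arithmetic with Python's fractions module (the function name is only for illustration):

```python
from fractions import Fraction

def coeff(n):
    """Exact coefficient of x^n in the Taylor series of sqrt(1-x) about 0:
    (1/2, n)/n! * (-1)^n, with (1/2, n) = (1/2)(1/2 - 1)...(1/2 - n + 1)."""
    falling = Fraction(1)
    for k in range(n):
        falling *= Fraction(1, 2) - k
    factorial = 1
    for k in range(1, n + 1):
        factorial *= k
    return falling / factorial * (-1) ** n

# First coefficients: 1, -1/2, -1/8, -1/16, -5/128
print([str(coeff(n)) for n in range(5)])
```

These match the expansion sqrt(1-x) = 1 - x/2 - x^2/8 - x^3/16 - 5x^4/128 - ...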

Quote: Originally Posted by **twizter**

A function is analytic in an open disc centered at b (in this case an open interval) if and only if its Taylor series converges to the value of the function at each point of the disc. Do you see the if and only if statement there? It means that you can use one to prove/disprove the other. So now look at the point x = 1. The function is not analytic at that point. So if |x| > 1, any open interval around 0 containing x also contains the non-analytic point x = 1, which implies that the Taylor series expanded about 0 does not converge to the value of sqrt(1-x) at that x. But for |x| < 1, the function is analytic in the open interval (-y, y) (where |x| < y < 1), and so at every point of this interval the series converges to the function value.

So, this means the function sqrt(1-x) is analytic (not merely having derivatives of all orders) for |x| < 1. Therefore the Taylor series centered at 0 converges to sqrt(1-x) for |x| < 1, and therefore also for x in (-1, 0]. Is this correct?
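To see this behavior numerically, here is a rough Python illustration (helper names are made up): inside (-1, 0] the partial sums of the binomial series approach sqrt(1-x), while for |x| > 1 they blow up.

```python
import math

def binom_half(n):
    # generalized binomial coefficient C(1/2, n)
    c = 1.0
    for k in range(n):
        c *= (0.5 - k) / (k + 1)
    return c

def partial_sum(x, terms):
    """Partial sum of Sigma C(1/2, n) (-x)^n, the series for sqrt(1 - x)."""
    return sum(binom_half(n) * (-x) ** n for n in range(terms))

# x in (-1, 0]: the error shrinks as more terms are taken
print(abs(partial_sum(-0.8, 300) - math.sqrt(1.8)))   # tiny

# |x| > 1: the terms grow geometrically, so the partial sums explode
print(abs(partial_sum(-1.5, 100)), abs(partial_sum(-1.5, 200)))  # huge
```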

Re: Prove Taylor series of sqrt(1-x) converges for (-1, 0]

Yes, that is right. Though it is important that you say the Taylor series "centered at 0" (or expanded about 0). And note the word "disc": it shows that you need to take equal lengths in both directions around the center for the open interval. The point x = 1 is the only non-analytic point of this function. So if you expand the function about any point b other than 1, the largest disc around b not containing 1 will have the Taylor series converging to the function inside it.

Correction to my previous post: the example of expanding about 5 was not correct for this problem if you are working with real-valued functions (the function is not defined at x = 5). If instead it were expanded about -5, the series would converge to the function in (-11, 1). Sorry about that.