# Math Help - Difference between x = a and x = 0 in a Taylor series

1. ## Difference between x = a and x = 0 in a Taylor series

I know the definition of the Taylor series says:

Let $f$ be a function with derivatives of all orders throughout some interval containing
$a$ as an interior point. Then the Taylor series generated by $f$ at $x = a$ is:

\begin{align*}\sum_{k=0}^{\infty} \frac{f^{(k)}(a)}{k!}(x-a)^k &= f(a) + f'(a)(x - a) + \frac{f''(a)}{2!}(x-a)^2 + \cdots\\&\quad + \frac{f^{(n)}(a)}{n!}(x-a)^n + \cdots.\end{align*}

So what do we mean when we say the series is generated at $x = a$ in the above definition? And what is meant when we use the above definition with $x = 0$?

Could someone kindly clarify these two cases? (I'm not sure about the difference between them.)

2. ## Re: Difference between x = a and x = 0 in a Taylor series

Taylor Series -- from Wolfram MathWorld

"The Taylor (or more general) series of a function f(x) about a point a up to order n may be found using Series[f, {x, a, n}] ..."

3. ## Re: Difference between x = a and x = 0 in a Taylor series

I understand it as centered at $a$. When it's centered at $0$, it's called the Maclaurin series.
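
In other words, setting $a = 0$ in the general formula gives the Maclaurin series $\sum_{k=0}^{\infty} \frac{f^{(k)}(0)}{k!}x^k$. A minimal Python sketch may help illustrate that both choices of center approximate the same function, just expanded in powers of $(x - a)$ instead of powers of $x$ (the helper name `taylor_exp` is my own, and I use $f(x) = e^x$ because every derivative of $e^x$ is $e^x$, so $f^{(k)}(a) = e^a$):

```python
import math

def taylor_exp(x, a, n):
    """Partial sum of the Taylor series of e^x centered at a, up to order n.

    Since every derivative of e^x is e^x, f^(k)(a) = e^a for all k, and the
    series is sum_{k=0}^{n} e^a * (x - a)^k / k!.
    """
    return sum(math.exp(a) * (x - a) ** k / math.factorial(k)
               for k in range(n + 1))

# Both centers approximate e^0.1; each partial sum is most accurate
# for x near its own center a.
print(taylor_exp(0.1, 0, 5))  # Maclaurin series (a = 0)
print(taylor_exp(0.1, 1, 5))  # Taylor series centered at a = 1
print(math.exp(0.1))          # exact value for comparison
```

Note that the $a = 0$ partial sum is closer to $e^{0.1}$ here because $x = 0.1$ is nearer to center $0$ than to center $1$.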

4. ## Re: Difference between x = a and x = 0 in a Taylor series

Originally Posted by delgeezee
I understand it as centered at $a$. When it's centered at $0$, it's called the Maclaurin series.
Thanks a lot.