Hi All.

Okay, I'm having a problem understanding the Taylor series expansion of a general polynomial. I understand how to derive the Maclaurin series expansion from first principles, i.e.

f(x) = a0 + a1x + a2x^2 + ... + a(n-1)x^(n-1)

becomes:

f(x) = f(0) + f'(0)x + (f''(0)x^2) / 2! + (f'''(0)x^3) / 3! + ...
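To see the pattern concretely, here's a quick sketch I put together (the cubic is just one I made up for checking): each Maclaurin coefficient f^(k)(0)/k! should hand back the original coefficient a_k.

```python
import math

# A made-up cubic, f(x) = 5 + 2x - 4x^2 + x^3, with derivatives worked out by hand
def f(x):   return 5 + 2*x - 4*x**2 + x**3
def df(x):  return 2 - 8*x + 3*x**2
def d2f(x): return -8 + 6*x
def d3f(x): return 6

# f^(k)(0) / k! should recover the coefficients 5, 2, -4, 1
coeffs = [f(0)   / math.factorial(0),
          df(0)  / math.factorial(1),
          d2f(0) / math.factorial(2),
          d3f(0) / math.factorial(3)]
print(coeffs)  # [5.0, 2.0, -4.0, 1.0]
```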

This is all done about a point that is close to x = 0, right? Now according to my teacher, in a Taylor series the x-axis gets shifted an arbitrary amount to the left, and so the polynomial f(x) becomes:

f(x) = a0 + a1(x - a) + a2(x - a)^2 + ... + a(n-1)(x - a)^(n-1)

and it's this that I'm having difficulty understanding. Can somebody explain to me, in a different way than my teacher did, what is actually going on here? I'm getting very frustrated with it. It looks obvious, but I can't see the wood for the trees.
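In case it helps anyone answer: here's the sanity check I sketched for the shifted version (same made-up cubic as above, expansion point a = 1 chosen arbitrarily). The coefficients b_k = f^(k)(a)/k! multiply powers of (x - a), and the rebuilt series should agree with f(x) everywhere:

```python
import math

def f(x):   return 5 + 2*x - 4*x**2 + x**3   # made-up cubic, just for checking
def df(x):  return 2 - 8*x + 3*x**2
def d2f(x): return -8 + 6*x
def d3f(x): return 6

a = 1.0  # arbitrary expansion point

# Coefficients of the powers of (x - a): b_k = f^(k)(a) / k!
b = [f(a)   / math.factorial(0),
     df(a)  / math.factorial(1),
     d2f(a) / math.factorial(2),
     d3f(a) / math.factorial(3)]

def taylor(x):
    # f(x) = b0 + b1(x - a) + b2(x - a)^2 + b3(x - a)^3
    return sum(bk * (x - a)**k for k, bk in enumerate(b))

print(taylor(3.0), f(3.0))  # both give 2.0
```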

Thanks.