# Thread: Proof (preferably related to geometrical intuition) of the validity of Taylor Series?

1. ## Proof (preferably related to geometrical intuition) of the validity of Taylor Series?

I understand that a function can be approximated by the following equation:

$\sum^{\infty}_{n=0} \frac{f^{(n)}(a)}{n!}(x-a)^n$

But I don't understand why. Is there some explanation (proof) that involves geometrically intuitive concepts (as in something that uses an x-y graph)? And if there is no such proof, can somebody provide a proof of the fact that the above equation (when the sum is continued indefinitely) is in fact equal to the function in question? I'd prefer a proof that is simpler, in terms of the mathematical knowledge required to understand it, if possible. But if there is no way around the complexity of the proof, can somebody at least explain parts of it in their post? Thanks in advance

2. The reasoning is not geometric, it's numerical. The simplest reasoning is that there is no way to evaluate transcendental functions exactly at every point. We may know some important information about them (e.g. that they are differentiable, what their derivatives are, specific values of the function), but we don't know ALL the important information...

Really the only calculations that can be carried out with numbers are addition, subtraction, multiplication, division and exponentiation. So for transcendental functions, the only way to evaluate them at any point is to find the correct combination of these five operations. Thus a polynomial, which is a general combination of the five, is required.

So suppose that we have a function $f(x)$ and we want to evaluate it at some point $x^*$. The only way we can do this is if we know the correct combination of the five operations. So we will write down a general polynomial and hope to find the correct combination from there...

$f(x) = c_0 + c_1(x - a) + c_2(x -a)^2 + c_3(x - a)^3 + c_4(x - a)^4 + \dots$.

Notice that if we were to substitute $x = a$ into the equation, it would eliminate everything except $c_0$. So

$f(a) = c_0 + c_1(a-a) + c_2(a-a)^2 + c_3(a-a)^3 + c_4(a-a)^4 + \dots$

$c_0 = f(a)$.

Now here is the problem: how do we find the other constants? We need an operation that will reduce exponents and remove the known constants. This operation is differentiation. So if $f(x)$ is (infinitely) differentiable in a neighbourhood of $x = a$, we can take the derivative of both sides

$f'(x) = c_1 + 2c_2(x - a) + 3c_3(x - a)^2 + 4c_4(x - a)^3 + 5c_5(x - a)^4 + \dots$.

Now if we let $x = a$ we can eliminate everything but $c_1$...

$f'(a) = c_1 + 2c_2(a-a) + 3c_3(a- a)^2 + 4c_4(a - a)^3 + 5c_5(a - a)^4 + \dots$

$c_1 = f'(a)$.

Differentiating both sides again gives

$f''(x) = 2c_2 + 3\cdot 2c_3(x-a) + 4\cdot 3c_4(x - a)^2 + 5\cdot 4c_5(x - a)^3 + 6\cdot 5c_6(x - a)^4 + \dots$.

Letting $x = a$ we find

$f''(a) = 2c_2$

$c_2 = \frac{f''(a)}{2}$.
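As a quick sanity check (my own sketch, not part of the original post), $c_2 = \frac{f''(a)}{2}$ can be confirmed numerically with a central-difference estimate of the second derivative:

```python
import math

def second_derivative(f, a, h=1e-5):
    # Central second-difference approximation of f''(a),
    # accurate to O(h^2) for smooth f.
    return (f(a + h) - 2.0 * f(a) + f(a - h)) / h**2

# For f = sin and a = 0.3 we know f''(a) = -sin(a),
# so c_2 should come out close to -sin(0.3)/2.
c2_estimate = second_derivative(math.sin, 0.3) / 2.0
print(c2_estimate, -math.sin(0.3) / 2.0)
```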

Differentiating both sides again gives

$f^{(3)}(x) = 3\cdot 2c_3 + 4\cdot 3\cdot 2c_4(x - a) + 5\cdot 4\cdot 3c_5(x - a)^2 + 6\cdot 5\cdot 4c_6(x - a)^3 + \dots$.

Letting $x = a$ we find

$f^{(3)}(a) = 3\cdot 2c_3$

$c_3 = \frac{f^{(3)}(a)}{3\cdot 2}$

$c_3 = \frac{f^{(3)}(a)}{3!}$.

Are you starting to see a pattern here? If we continue on this way to evaluate all the constants, we find

$f(x) = f(a) + f'(a)(x - a) + \frac{f''(a)}{2}(x - a)^2 + \frac{f^{(3)}(a)}{3!}(x - a)^3 + \frac{f^{(4)}(a)}{4!}(x - a)^4 + \dots$

$f(x) = \sum_{n = 0}^{\infty}\frac{f^{(n)}(a)}{n!}(x-a)^n$
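To see the pattern in a concrete case (my example, not part of the original post): take $f(x) = e^x$ and $a = 0$. Every derivative is $e^x$, so $f^{(n)}(0) = 1$ for all $n$, and the formula gives

```latex
c_n = \frac{f^{(n)}(0)}{n!} = \frac{1}{n!},
\qquad
e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!}
    = 1 + x + \frac{x^2}{2} + \frac{x^3}{3!} + \dots
```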

and now if we wanted to evaluate the function at $x = x^*$, we can substitute $x^*$ into the Taylor polynomial and simplify to whatever degree of accuracy we like. This is also the method that calculators use - they are programmed with Taylor Polynomials...
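The evaluation procedure described above can be sketched in a few lines of Python (my sketch, not from the thread), again using $f(x) = e^x$ about $a = 0$, where all $c_n = \frac{1}{n!}$:

```python
import math

def taylor_exp(x, terms):
    # Partial sum of the Taylor series of e^x about a = 0:
    # sum_{n=0}^{terms-1} x^n / n!
    return sum(x**n / math.factorial(n) for n in range(terms))

# Taking more terms gives whatever degree of accuracy we like.
x = 1.5
for n in (2, 5, 10, 15):
    print(n, taylor_exp(x, n), abs(taylor_exp(x, n) - math.exp(x)))
```

The error shrinks rapidly as terms are added, since $\frac{x^n}{n!} \to 0$.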

3. Originally Posted by Prove It
Are you starting to see a pattern here? If we continue on this way to evaluate all the constants, we find

$f(x) = f(a) + f'(a)(x - a) + \frac{f''(a)}{2}(x - a)^2 + \frac{f^{(3)}(a)}{3!}(x - a)^3 + \frac{f^{(4)}(a)}{4!}(x - a)^4 + \dots$

$f(x) = \sum_{n = 0}^{\infty}\frac{f^{(n)}(a)}{n!}(x-a)^n$
Yes, I see the pattern; it's crystal clear now. Thank you, the general Taylor series makes so much more sense now. Obviously, this seems to prove that any function can be represented by some infinite series of polynomial terms, but "it seems" is never a valid form of evidence. So, even with my limited experience in real analysis, I'm going to make an attempt to slightly increase the rigour of my statement. Would it be more accurate to say:

Any function $f$ may be approximated, with a Taylor series, to any desired level of accuracy, if the following requirements are met:

[1] $f$ is differentiable at $a$.

[2] $f$ is 'infinitely differentiable' at $a$. When I say 'infinitely differentiable' I mean that, when:

$f^{(n)}(x)$ denotes the $n$th derivative of $f(x)$

And given that:

$f^{(1)}(x) = f'(x) \neq 0$ or $f^{(1)}(x) = f'(x) \neq c$

(where $c$ is a constant)

Then whenever:

$f^{(n)}(x)$

is a non-trivial $n$th derivative, it implies that:

$f^{(n+1)}(x)$

is a non-trivial $(n+1)$th derivative.

I'm certain that the above is very, very, very far from the proper mathematical requirements of being "rigorous". I'm also quite certain that the stipulations are erroneous, but there is a point to them. I was hoping I could show what my understanding of which functions can be approximated by Taylor series is, so that anybody reading could point out the errors in my understanding specifically. That way, my misunderstanding is fixed at the source, rather than me reading a general response to the second question. Thanks in advance

4. I don't think you are very far off, but if you read the wiki page on Taylor series there is an example of an infinitely differentiable function that is not equal to its Taylor series.
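The standard counterexample of this kind is $f(x) = e^{-1/x^2}$ (with $f(0) = 0$): every derivative of $f$ at $0$ is $0$, so its Taylor series about $a = 0$ is the zero function, yet $f$ itself is not. A quick sketch in Python (mine, for illustration):

```python
import math

def f(x):
    # Classic counterexample: infinitely differentiable everywhere,
    # but every derivative at 0 vanishes, so the Taylor series
    # about a = 0 is identically zero.
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# The Taylor series predicts 0 for every x, yet f(x) > 0 off the origin:
for x in (0.5, 0.2, 0.1):
    print(x, f(x))
```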

If you consider complex analysis: there, a function that is differentiable is automatically infinitely differentiable and is indeed equal to its Taylor series, but we are talking about real analysis now.