
- July 30th 2013, 08:40 AM, Boo: Taylor's series, proof
Please, would some good soul tell me the proof of Taylor's theorem?

How did he come to this?

Many thanks... because I just can't find it anywhere!
- July 30th 2013, 09:03 AM, Trefoil2727: Re: Taylor's series, proof
Maybe you should check Wikipedia, or just do a Google search.

- July 31st 2013, 08:58 AM, Boo: Re: Taylor's series, proof
I tried... but I don't find anything I could understand...

- July 31st 2013, 10:02 AM, topsquark: Re: Taylor's series, proof
See if this helps.

-Dan
- August 1st 2013, 01:10 AM, Boo: Re: Taylor's series, proof
I think the problem is still here:

I read this:

Now, we should set x = a + h, where a and h are not specified; they can be any numbers.

Now, I simply do not understand what we ACTUALLY WANTED TO DO!

Could someone explain that to me more clearly?

Many thanks!

P.S. Before all: it should not be the same a as in the polynomial... it is just the same variable name?
- August 1st 2013, 06:11 AM, Deveno: Re: Taylor's series, proof
This is not a proof, but rather a sort of explanation.

Given a "suitably nice function" f (and by "suitably nice" I mean "smooth" or "infinitely differentiable") we would like to be able to approximate it with a kind of function we know well, and is easy to work with.

Well, polynomial functions are very easy to understand, so they are a good candidate for "replacing f with a similar, but 'nicer', function". If we want to be VERY precise (or perhaps even... "exact") in our approximation, we might need a very high-degree polynomial. It turns out there is a name for "infinite-degree polynomials" of a nice sort... power series.

So, we would like to be able to express f in the form:

f(x) = c_0 + c_1(x - a) + c_2(x - a)^2 + c_3(x - a)^3 + ...

Obviously we can't "go on forever", but if k goes from 0 to some suitably large number N, we'll get "close enough" if the coefficients c_k shrink fast enough. This is called a "power series expansion" about the point of interest, a. Now, how can calculus help us, here?
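To see "close enough for large N" concretely, here is a small Python sketch. It uses e^x about a = 0, whose coefficients c_k = 1/k! are a standard known expansion (assumed here as a worked fact, not derived yet in this post):

```python
import math

# Partial sums of the power series for e^x about a = 0.
# The coefficients c_k = 1/k! are the standard expansion for exp;
# they are assumed here only to illustrate convergence as N grows.
def partial_sum(x, N):
    """Sum of c_k * (x - a)^k for k = 0..N, with a = 0 and c_k = 1/k!."""
    return sum(x ** k / math.factorial(k) for k in range(N + 1))

x = 1.0
for N in (2, 5, 10):
    print(N, partial_sum(x, N), math.exp(x))
```

The coefficients shrink fast enough (like 1/k!) that the tail past N becomes negligible, so the partial sums settle toward e^x.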

Well, the 0-th derivative of f (the function itself) tells us what value f has at a. So this will be our "constant term", which gives us our "starting point". At the point x = a, all the (x - a) powers will be 0, and so our constant term should be f(a). So any "reasonable" power series expansion for f at a should have: c_0 = f(a).

If we want the best LINEAR approximation, what do we need to know? The SLOPE of f at a. We know this information is provided by the derivative f'(a). So what line passes through the point (a, f(a)) and has slope f'(a)? Using the point-slope formula for a line, we get:

y - f(a) = f'(a)(x - a), or:

y = f(a) + f'(a)(x - a).

So our best "first-order expansion" is with c_0 = f(a), c_1 = f'(a).
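As a quick numerical sketch of the tangent-line idea (my own illustration, not from the thread), take f = sin at a = 0, where f(a) = 0 and f'(a) = cos(0) = 1:

```python
import math

# First-order Taylor approximation: y(x) = f(a) + f'(a) * (x - a).
# Illustrated with f = sin at a = 0 (so f(a) = 0 and f'(a) = 1).
def linear_approx(f, fprime, a, x):
    return f(a) + fprime(a) * (x - a)

a = 0.0
x = 0.1
approx = linear_approx(math.sin, math.cos, a, x)  # 0.1
exact = math.sin(x)                               # about 0.0998
error = abs(approx - exact)                       # small near a
```

Near a the tangent line is an excellent stand-in for f; the error grows only as you move away from a.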

Now, what does the SECOND derivative of f at a, tell us? Which way f is "curving" (up, or down). This gives us a little bit more "bendy" approximation, which "fits better". It turns out the "best fit parabola" for f is:

y = f(a) + f'(a)(x - a) + (1/2)f''(a)(x - a)^{2}.

Do you see where this is going? Let's differentiate the function y(x) we have at this point:

y'(x) = f'(a) + f''(a)(x - a)

At the point a, we have y'(a) = f'(a), so the slope of the parabola at the point a is the same as the slope of f ! Not bad, eh? Differentiating again:

y''(x) = f''(a), so at the point a, y''(a) = f''(a)... the parabola has the same SECOND derivative as f at a, as well! So the parabola is "curving the same way at a as f does". So, already, our function y is acting "quite a bit like f", even though it's just a parabola. I think we may be onto something, here.....
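Here is a small sketch of the "best fit parabola" (again my own example, using f = cos at a = 0, where f(0) = 1, f'(0) = 0, and f''(0) = -1):

```python
import math

# Second-order ("best fit parabola") approximation at a:
#   y(x) = f(a) + f'(a)(x - a) + (1/2) f''(a)(x - a)^2
# With f = cos at a = 0 the parabola is y(x) = 1 - x^2 / 2.
def parabola(x, a, f_a, fp_a, fpp_a):
    return f_a + fp_a * (x - a) + 0.5 * fpp_a * (x - a) ** 2

x = 0.2
second_order = parabola(x, 0.0, 1.0, 0.0, -1.0)  # about 0.98
tangent_line = 1.0  # the first-order approx: cos has slope 0 at a = 0
# |cos(x) - second_order| is far smaller than |cos(x) - tangent_line|
```

Because cos has zero slope at 0, the tangent line is flat and misses the curving; the parabola picks up the second derivative and fits much more closely.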

It turns out that taking c_k = f^(k)(a)/k!, that is,

y(x) = f(a) + f'(a)(x - a) + (1/2!)f''(a)(x - a)^2 + ... + (1/N!)f^(N)(a)(x - a)^N,

often provides a VERY GOOD approximation to f; that is the general idea behind Taylor's theorem.

Now the PROOF is somewhat more difficult: we have to show our "good guess" for the coefficients of our power series for f at a really does "converge" to f (under suitable conditions). And this is usually done by showing the "remainder term" (all the parts up to infinity we haven't added on yet) becomes as small as we like, as long as we make k = N "big enough", which gets a bit technical. The basic strategy is to show there is some number depending on N that shrinks as N gets large, that BOUNDS the remainder term. And the formula for this "remainder term" is a bit daunting: it involves integrals typically, and the argument can get a bit messy.
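To watch the remainder shrink with N (a numerical sketch, not the analytic proof), measure |e^x - (degree-N partial sum)| directly for growing N:

```python
import math

# The "remainder" of the degree-N approximation to e^x about a = 0,
# measured directly as |e^x - partial sum|. The proof bounds this
# quantity analytically; here we just watch it shrink as N grows.
def taylor_exp(x, N):
    return sum(x ** k / math.factorial(k) for k in range(N + 1))

x = 1.0
remainders = [abs(math.exp(x) - taylor_exp(x, N)) for N in range(8)]
# remainders decrease steadily: roughly 1.72, 0.72, 0.22, ..., 2.8e-05
```

This is exactly the behavior the proof has to establish in general: a bound on the leftover tail that goes to zero as N grows.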

But don't let that obscure the basic simple idea: we are approximating a function by a "truncated power series" (a polynomial), letting us describe a large family of functions by ones we know and love much better.
- August 1st 2013, 07:03 AM, HallsofIvy: Re: Taylor's series, proof
You titled this "Taylor's series", which is NOT the same as "Taylor's theorem". Taylor's theorem states that, as long as the (n+1)st derivative of f exists on an interval, with a in that interval, the error in replacing f(x) by the Taylor polynomial

f(a) + f'(a)(x - a) + (1/2!)f''(a)(x - a)^2 + ... + (1/n!)f^(n)(a)(x - a)^n

is less than or equal to

M|x - a|^(n+1) / (n+1)!,

where M is an upper bound on the value of the (n+1)st derivative of f on that interval.
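A quick numerical check of that error bound (a sketch of my own, taking f = exp on [0, 1] with a = 0; every derivative of exp is exp itself, so M = e works as an upper bound there):

```python
import math

# Taylor's theorem error bound, checked for f = exp on [0, 1], a = 0.
# Every derivative of exp is exp, so M = e bounds the (n+1)st
# derivative on [0, 1].
def taylor_exp(x, n):
    return sum(x ** k / math.factorial(k) for k in range(n + 1))

a, x, n = 0.0, 1.0, 5
M = math.e
actual_error = abs(math.exp(x) - taylor_exp(x, n))
bound = M * abs(x - a) ** (n + 1) / math.factorial(n + 1)
# actual_error (about 1.6e-3) stays below bound (about 3.8e-3)
```

The theorem guarantees actual_error <= bound for every x in the interval, and the (n+1)! in the denominator is what drives the bound to zero as n grows.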

I would expect the proof to be in any calculus text. I don't remember it all, but I think it requires the "extended mean value theorem" (Cauchy's mean value theorem): if f and g are both differentiable on an interval [a, b], with g' nonzero there, then there exists c in that interval such that

(f(b) - f(a)) / (g(b) - g(a)) = f'(c) / g'(c).

I believe it takes g(x) = (x - a)^(n+1) and applies that theorem n times.

You first say you couldn't find the proof, but after being directed to Wikipedia you say you could not find a proof that you could **understand**. I don't see how we could possibly direct you to a proof that **you** could understand when we don't know what you can or cannot "understand".