
Math Help - Taylor's series, proof

  1. #1 Boo (Junior Member, joined Jan 2013, from Pizza, 57 posts)

    Taylor's series, proof

    Please, would some good soul tell me the proof of Taylor's theorem?
    How did he come to this?

    Many thanks, because I just can't find it anywhere!

  2. #2 Trefoil2727 (Member, joined Apr 2013, from Mahinog, 91 posts)

    Re: Taylor's series, proof

    Maybe you should check Wikipedia, or just do a Google search.

  3. #3 Boo (Junior Member, joined Jan 2013, from Pizza, 57 posts)

    Re: Taylor's series, proof

    I tried... but I don't find anything I could understand...

  4. #4
    Forum Admin topsquark's Avatar
    Joined
    Jan 2006
    From
    Wellsville, NY
    Posts
    10,110
    Thanks
    382
    Awards
    1

    Re: Taylor's series, proof

    See if this helps.

    -Dan

  5. #5 Boo (Junior Member, joined Jan 2013, from Pizza, 57 posts)

    Re: Taylor's series, proof

    I think the problem is still here.
    I read this:

    f(x) = a_0 + a_1 x^1 + \cdots + a_n x^n

    Now, we should set x = a + h, where a and h are not defined, but can be any numbers.

    Now, I simply do not understand what we ACTUALLY wanted to do!
    Could someone explain that to me more clearly?
    Many thanks!

    P.S. Before anything else, it should not be the same a as in the polynomial... it is just the same variable name, right?

  6. #6 Deveno (MHF Contributor, joined Mar 2011, from Tejas, 3,401 posts)

    Re: Taylor's series, proof

    This is not a proof, but rather a sort of explanation.

    Given a "suitably nice function" f (and by "suitably nice" I mean "smooth" or "infinitely differentiable") we would like to be able to approximate it with a kind of function we know well, and is easy to work with.

    Well, polynomial functions are very easy to understand, so they are a good candidate for "replacing f with a similar, but 'nicer' function". If we want to be VERY precise (or perhaps even...."exact") in our approximation, we might need a very high-degree polynomial. It turns out there is a name for "infinite degree polynomials" of a nice sort....power series.

    So, we would like to be able to express f in the form:

    f(x) = \sum_{k=0}^{\infty} c_k(x - a)^k = c_0 + c_1(x - a) + c_2(x - a)^2 + c_3(x - a)^3 + \cdots

    Obviously we can't "go on forever", but if k goes from 0 to some suitably large number N, we'll get "close enough" if the coefficients c_k shrink fast enough. This is called a "power series expansion" about the point of interest, a. Now, how can calculus help us here?

    Well, the 0-th derivative of f (the function itself) tells us what value f has at a. So this will be our "constant term", which gives us our "starting point". At the point x = a, all the (x - a) powers will be 0, and so our constant term should be f(a). So any "reasonable" power series expansion for f at a should have: c_0 = f(a).

    If we want the best LINEAR approximation, what do we need to know? The SLOPE of f at a. We know this information is provided by the derivative f'(a). So what line goes through f(a) and has slope f'(a)? Using the point-slope formula for a line, we get:

    y - f(a) = f'(a)(x - a), or:

    y = f(a) + f'(a)(x - a).

    So our best "first-order expansion" is with c_0 = f(a), c_1 = f'(a).
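
    (A quick worked example, mine rather than part of the original post: for f(x) = e^x at a = 0 we have f(0) = 1 and f'(0) = 1, so the best linear approximation is y = 1 + x, which is exactly the tangent line to e^x at 0.)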

    Now, what does the SECOND derivative of f at a, tell us? Which way f is "curving" (up, or down). This gives us a little bit more "bendy" approximation, which "fits better". It turns out the "best fit parabola" for f is:

    y = f(a) + f'(a)(x - a) + (1/2)f''(a)(x - a)^2.

    Do you see where this is going? Let's differentiate the function y(x) we have at this point:

    y'(x) = f'(a) + f''(a)(x - a)

    At the point a, we have y'(a) = f'(a), so the slope of the parabola at the point a is the same as the slope of f ! Not bad, eh? Differentiating again:

    y"(x) = f"(a), so at the point a, y"(a) = f"(a)...the parabola has the same SECOND derivative as f at a, as well! So the parabola is "curving the same way at a as f does". So, already, our function y is acting "quite a bit like f", even though it's just a parabola. I think we may be onto something, here.....

    It turns out that taking c_k = \frac{f^{(k)}(a)}{k!} often provides a VERY GOOD approximation to f; that is the general idea behind Taylor's theorem.

    Now the PROOF is somewhat more difficult: we have to show our "good guess" for the coefficients of our power series for f at a really does "converge" to f (under suitable conditions). And this is usually done by showing the "remainder term" (all the parts up to infinity we haven't added on yet) becomes as small as we like, as long as we make k = N "big enough", which gets a bit technical. The basic strategy is to show there is some number depending on N that shrinks as N gets large, that BOUNDS the remainder term. And the formula for this "remainder term" is a bit daunting: it involves integrals typically, and the argument can get a bit messy.
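
    Here is a minimal numerical sketch of that idea in Python (my own illustration, not part of the original post). For f(x) = e^x at a = 0 every derivative equals 1, so the coefficients are simply c_k = 1/k!, and the truncated series closes in on the true value as the degree N grows:

    import math

    def taylor_exp(x, N):
        # Partial sum of the Taylor series of exp about a = 0, up to degree N.
        # Here c_k = f^(k)(0)/k! = 1/k!, since every derivative of exp at 0 is 1.
        return sum(x**k / math.factorial(k) for k in range(N + 1))

    x = 1.5
    for N in (2, 4, 8, 16):
        approx = taylor_exp(x, N)
        # The printed error (the "remainder") shrinks rapidly as N grows.
        print(N, approx, abs(approx - math.exp(x)))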

    But don't let that obscure the basic simple idea: we are approximating a function by a "truncated power series" (a polynomial), letting us describe a large family of functions by ones we know and love much better.

  7. #7 HallsofIvy (MHF Contributor, joined Apr 2005, 16,057 posts)

    Re: Taylor's series, proof

    You titled this "Taylor's series", which is NOT the same as "Taylor's theorem". Taylor's theorem states that, as long as the (n+1)st derivative of f exists on an interval, with a in that interval, the error in replacing f(x) by f(a) + f'(a)(x - a) + (f''(a)/2)(x - a)^2 + ... + (f^{(n)}(a)/n!)(x - a)^n is less than or equal to (M/(n+1)!)|x - a|^{n+1}, where M is an upper bound on the absolute value of the (n+1)st derivative of f on that interval.
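
    As a concrete check of that bound (my example, not part of the original post): for f(x) = sin(x) about a = 0, every derivative is \pm\sin or \pm\cos, so M = 1 works for every n, and the theorem promises |sin(x) - T_n(x)| \le |x|^{n+1}/(n+1)!. A few lines of Python confirm it for n = 7 and x = 2:

    import math

    def sin_taylor(x, n):
        # Degree-n Taylor polynomial of sin about a = 0: only odd powers appear,
        # with alternating signs, i.e. x - x^3/3! + x^5/5! - ...
        return sum((-1)**(k // 2) * x**k / math.factorial(k) for k in range(1, n + 1, 2))

    x, n = 2.0, 7
    error = abs(math.sin(x) - sin_taylor(x, n))
    bound = abs(x)**(n + 1) / math.factorial(n + 1)
    # The actual error stays below the theorem's bound.
    print(error, bound, error <= bound)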

    I would expect the proof to be in any calculus text. I don't remember it all, but I think it requires the "extended mean value theorem": if f and g are both differentiable on the interval [x_0, x_1], with g' nonzero there, then there exists c in that interval such that \frac{f(x_1) - f(x_0)}{g(x_1) - g(x_0)} = \frac{f'(c)}{g'(c)}. I believe it takes g(x) = (x - a)^{n+1} and applies that theorem n times.

    You first say you couldn't find the proof but after being directed to Wikipedia say you could not find a proof that you could understand. I don't see how we could possibly direct you to a proof that you could understand when we don't know what you can or cannot "understand".

