# Math Help - Taylor's theorem question.

1. ## Taylor's theorem question.

Hi, I have been stuck on the following for ages; any guidance would be welcome.

A function f: R --> R is C^3 and satisfies

f(a+h) = f(a) + f'(a + 0.5h)h

for all real a and h ≠ 0.

By applying Taylor's theorem to f and f', show that the third derivative of f is identically 0.

I can see that f'(a + 0.5h)h looks like the Lagrange form of the remainder,

and I know that similarly $f'(a+0.5h)=f'(a)+0.5h\,f''(\delta)$ for some $\delta$ between $a$ and $a+0.5h$.

Therefore, writing $x = a+h$, $f(x)= f(a) + \bigl(f'(a) + 0.5(x-a)f''(\delta)\bigr)(x-a)$.

Is this even the right way to attempt the question? If so, how does the above imply that the third derivative of f is always zero?

2. Taylor's theorem for f says $f(a+h) = f(a) + hf'(a) + \tfrac12h^2f''(a) + \tfrac16h^3f'''(a+\theta h)$ (for some $\theta$ between 0 and 1).

Taylor's theorem for f' says $f'(a+\tfrac h2) = f'(a) + \tfrac h2f''(a) + \tfrac12\bigl(\tfrac h2\bigr)^2f'''(a+\phi h)$ (for some $\phi$ between 0 and 1/2). Multiply that equation by h, subtract it from the previous one, and use the given information that $f(a+h) = f(a) + hf'(a+\tfrac h2)$. After cancelling $f(a)$ and $hf'(a)$ and dividing by $\tfrac{h^3}{24}$, you'll end up with the equation $4f'''(a+\theta h) = 3f'''(a+\phi h)$. Now let $h\to0$ and use continuity of $f'''$ to see that $4f'''(a) = 3f'''(a)$, from which obviously $f'''(a)=0$.