Hi, I have been stuck on the following for ages; any guidance would be welcome.

A function f: R --> R is C^3 and

f(a + h) = f(a) + f'(a + 0.5h)h

for all real a and h ≠ 0.

By applying Taylor's theorem to f and f', show that the third

derivative of f is identically 0.

I can see that f'(a + 0.5h)h has the form of a Lagrange error term,

and I know that, similarly, applying Taylor's theorem with Lagrange remainder to f' gives f'(a + 0.5h) = f'(a) + 0.5h f''(δ) for some δ between a and a + 0.5h.

Therefore, with x = a + h, f(x) = f(a) + (f'(a) + 0.5 f''(δ)(x - a))(x - a) = f(a) + f'(a)(x - a) + 0.5 f''(δ)(x - a)^2.

Is this even the right way to attempt the question? If so, how does the above imply that the third derivative of f is always zero?
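Not part of the original question, but here is a quick symbolic sketch (using sympy; the setup and names are my own) of the expansion route the problem suggests: model f near a by its third-order Taylor polynomial, with c3 standing in for f'''(a), and compare both sides of the given identity.

```python
# Symbolic check of the Taylor-expansion approach (my own sketch).
# f is modeled by its third-order Taylor polynomial at a, so
# c0, c1, c2, c3 play the roles of f(a), f'(a), f''(a), f'''(a).
import sympy as sp

h, c0, c1, c2, c3 = sp.symbols('h c0 c1 c2 c3')
s = sp.symbols('s')

f = c0 + c1*s + c2*s**2/2 + c3*s**3/6   # f(a + s), expanded about a
fp = sp.diff(f, s)                       # f'(a + s)

lhs = f.subs(s, h)                       # f(a + h)
rhs = f.subs(s, 0) + h*fp.subs(s, h/2)   # f(a) + f'(a + 0.5h)h

print(sp.expand(lhs - rhs))              # c3*h**3/24
```

The h and h² terms cancel, and what survives is c3·h³/24, i.e. (1/6 − 1/8) f'''(a) h³. For the identity to hold for every h, this must vanish, which forces f'''(a) = 0 at every a.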