I think I've got it. I'm pretty sure it follows from the magic of Taylor's theorem. So disregard.
I'm not exactly a beginner at this, but for some reason I've hit a wall on this simple result.
Suppose f : R --> R is twice continuously differentiable on an open set containing a point a in R. Does it follow that
f''(a) = limit (h-->0) [f(a+h) + f(a-h) - 2f(a)] / h^2 ?
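(For reference, here is a sketch of the Taylor's-theorem argument alluded to in the edit at the top, assuming only that f is C^2 near a; this is my reconstruction, not necessarily the one in the proof being read.)

```latex
% Taylor expansion with Peano remainder, valid since f is C^2 near a:
\begin{align*}
f(a+h) &= f(a) + f'(a)\,h + \tfrac{1}{2} f''(a)\,h^2 + o(h^2), \\
f(a-h) &= f(a) - f'(a)\,h + \tfrac{1}{2} f''(a)\,h^2 + o(h^2).
\end{align*}
% Adding the two expansions cancels the first-order terms:
\begin{align*}
f(a+h) + f(a-h) - 2f(a) &= f''(a)\,h^2 + o(h^2),
\end{align*}
% so dividing by h^2 and letting h \to 0 yields the claimed limit.
```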
What if we additionally assume that f(t) <= f(a) for all t in a neighborhood of a, i.e., that a is a local maximum of f?
I'm not sure how the proof I'm reading arrived at the above limit, but here is what makes me suspicious: the second derivative is defined in terms of two iterated limits (say limit s-->0 limit h-->0 F(s,h)), whereas the expression above involves only the single parameter h. So it looks as though we're taking the limit along the diagonal s = h of something, and in general an iterated limit and a diagonal limit need not agree. How is this step justified here? Thanks for any help.
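As a quick numerical sanity check of the claimed limit (not a proof), one can evaluate the symmetric second-difference quotient for a smooth test function and compare it with the known second derivative; the choice f = sin and the point a = 0.7 below are arbitrary illustrations.

```python
import math

def symmetric_second_difference(f, a, h):
    # The quotient from the question: (f(a+h) + f(a-h) - 2 f(a)) / h^2
    return (f(a + h) + f(a - h) - 2.0 * f(a)) / (h * h)

# Test function sin, whose exact second derivative is -sin.
a = 0.7
approx = symmetric_second_difference(math.sin, a, 1e-3)
exact = -math.sin(a)
print(approx, exact)  # the two agree to roughly O(h^2)
```

Note that h should not be taken too small here: since the numerator suffers catastrophic cancellation, shrinking h below about 1e-5 makes floating-point rounding error (of order eps / h^2) dominate the O(h^2) truncation error.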