Originally Posted by **Math Major**

Suppose that $\displaystyle g:[0,1] \rightarrow \mathbb{R} $ is continuous, $\displaystyle g(0) = g(1) = 0 $, and for every $\displaystyle c \in (0,1) $ there is some $\displaystyle k > 0 $ such that:

$\displaystyle 0 < c-k < c < c+k < 1 $ and $\displaystyle g(c) = \frac{1}{2}(g(c+k) + g(c-k)) $.

Prove that $\displaystyle g(x) = 0 $ for all $\displaystyle x \in [0,1] $.
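Not part of a proof, but a quick numerical sanity check of the statement: for a nonzero candidate such as $\displaystyle g(x) = x(1-x) $ (continuous, with $g(0) = g(1) = 0$), the midpoint condition fails at the maximizer $c = 1/2$, since every admissible $k$ gives an average strictly below $g(c)$.

```python
# Sanity check (illustration only): g(x) = x(1 - x) is continuous with
# g(0) = g(1) = 0, but at its maximizer c = 1/2 the midpoint average
# 0.5*(g(c+k) + g(c-k)) = 1/4 - k^2 is strictly below g(c) = 1/4 for
# every admissible k, so the hypothesis fails -- consistent with the
# claim that only g = 0 can satisfy it.

def g(x):
    return x * (1 - x)

c = 0.5
for k in [0.4, 0.25, 0.1, 0.01]:
    avg = 0.5 * (g(c + k) + g(c - k))
    print(f"k = {k}: average = {avg:.6f}, g(c) = {g(c):.6f}, avg < g(c): {avg < g(c)}")
```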

My work:

Since $\displaystyle g $ is continuous on the closed, bounded interval $\displaystyle [0,1] $, it attains its maximum, say M, on [0,1] (Extreme Value Theorem). So, let A = { $\displaystyle x \in [0,1] : g(x) = M $}. Since g attains its maximum, A is nonempty, and A is clearly bounded above by 1, so $\displaystyle x_0 = \sup(A) $ exists. Moreover, $\displaystyle A = g^{-1}(\{M\}) $ is closed because g is continuous, so in fact $\displaystyle x_0 \in A $, i.e. $\displaystyle g(x_0) = M $.

Suppose, for contradiction, that $\displaystyle M > 0 $. Since $\displaystyle g(0) = g(1) = 0 < M $, neither endpoint lies in A, so $\displaystyle x_0 \in (0,1) $.

Then there is some $\displaystyle k > 0 $ such that $\displaystyle 0 < x_0 - k < x_0 < x_0 + k < 1 $ and $\displaystyle M = g(x_0) = \frac{1}{2} (g(x_0 + k) + g(x_0 -k)) $.

But since $\displaystyle g \le M $ everywhere on [0,1] and the average of $\displaystyle g(x_0 + k) $ and $\displaystyle g(x_0 - k) $ equals M, this forces $\displaystyle g(x_0 + k) = g(x_0 -k) = g(x_0) = M $.
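Spelled out, that averaging step uses only the bound $\displaystyle g \le M $:

```latex
\[
2M = 2g(x_0) = g(x_0 + k) + g(x_0 - k) \le M + M = 2M,
\]
```

so the inequality in the middle is an equality, and since each summand is at most $M$, both $\displaystyle g(x_0 + k) = M $ and $\displaystyle g(x_0 - k) = M $.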

This is where I'm getting stuck. Can anyone point me in the right direction?