I didn't know MathStud28's example. The example that I would have called the "classical" one is: $f(x) = e^{-1/x^2}$ for $x \neq 0$, and $f(0) = 0$. I can't remember which famous mathematician introduced it, perhaps Cauchy; anyway, it has a long history. And it seems simpler to deal with.
For $x \neq 0$, $f'(x) = \frac{2}{x^3} e^{-1/x^2}$. First notice that $\lim_{x \to 0^+} e^{-1/x^2} = 0$. Notice as well that $\lim_{x \to 0^+} \frac{e^{-1/x^2}}{x^n} = 0$ for any integer $n \geq 0$. To convince yourself of this, substitute $t = 1/x$, so that the limit is $\lim_{t \to +\infty} t^n e^{-t^2} = 0$.
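If you want to see that key limit numerically, here is a quick Python sketch (the helper name `g` is mine, just for illustration) showing that $e^{-1/x^2}/x^n$ collapses toward $0$ as $x \to 0^+$, even for fairly large $n$:

```python
from math import exp

def g(x, n):
    """e^{-1/x^2} / x^n for x > 0."""
    return exp(-1.0 / x**2) / x**n

# The exponential decay eventually dominates any power of 1/x:
# for each n, the values shrink rapidly as x -> 0+.
for n in (1, 5, 20):
    print(n, [g(x, n) for x in (0.5, 0.2, 0.1, 0.05)])
```

Even for $n = 20$, where $1/x^n$ blows up very fast, the values head to $0$ once $x$ is small enough, exactly as the substitution $t = 1/x$ predicts.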
Now, the $n$-th derivative of $f$ is (easily) seen to be a rational function of $x$ (a quotient of two polynomials) times $e^{-1/x^2}$. This is proved by induction: suppose there is a rational function $R_n$ such that the $n$-th derivative is, for $x \neq 0$, $f^{(n)}(x) = R_n(x)\, e^{-1/x^2}$, and prove that the next derivative is again of the same kind, since $f^{(n+1)}(x) = \left( R_n'(x) + \frac{2}{x^3} R_n(x) \right) e^{-1/x^2}$.
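The induction step is mechanical enough to automate. Here is a minimal Python sketch (the function names are mine) that carries the recurrence $R_{n+1} = R_n' + \frac{2}{x^3} R_n$ on the prefactors, representing each $R_n$ as a polynomial in $1/x$ stored as a dict `{k: c}` meaning $\sum c\, x^{-k}$:

```python
from math import exp

def f(x):
    return exp(-1.0 / x**2)

def next_prefactor(R):
    """Given R_n as {k: c} meaning R_n(x) = sum of c * x^(-k),
    return R_{n+1} so that f^(n+1)(x) = R_{n+1}(x) * e^{-1/x^2}."""
    out = {}
    for k, c in R.items():
        if k:  # product rule: differentiate c * x^(-k)
            out[k + 1] = out.get(k + 1, 0) - k * c
        # chain rule: multiply by d/dx(-1/x^2) = 2/x^3
        out[k + 3] = out.get(k + 3, 0) + 2 * c
    return out

def evaluate(R, x):
    return sum(c * x ** (-k) for k, c in R.items()) * f(x)

R = {0: 1}             # R_0 = 1, i.e. f itself
R = next_prefactor(R)  # R_1 = 2/x^3
print(R)               # {3: 2}

# Sanity check against a central finite difference of f at x = 0.7:
h = 1e-6
approx = (f(0.7 + h) - f(0.7 - h)) / (2 * h)
print(abs(evaluate(R, 0.7) - approx) < 1e-6)
```

Iterating `next_prefactor` gives, e.g., $f''(x) = \left(\frac{4}{x^6} - \frac{6}{x^4}\right) e^{-1/x^2}$, always a (Laurent) polynomial in $1/x$ times $e^{-1/x^2}$.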
Because of the previously mentioned limit, you obtain that the $n$-th derivative converges to $0$ as $x$ tends to $0$ from the right. By symmetry ($f$ is even), the same holds from the left.
Finally, there is a theorem (that you probably know if you're being asked this problem) telling you that, as a consequence, $f$ is indefinitely differentiable at $0$, and the derivatives at $0$ are the limits of the derivatives at $x$ as $x$ tends to $0$. That is to say, all derivatives at $0$ exist and equal $0$. Yet, the function vanishes only at zero, so its Taylor series at $0$, which is identically zero, agrees with $f$ nowhere except at $0$ itself...
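For reference, the theorem alluded to above is a standard corollary of the mean value theorem; one way to state it is:

```latex
% Derivative limit theorem (corollary of the MVT):
% if g is continuous at 0, differentiable on a punctured
% neighborhood of 0, and lim_{x -> 0} g'(x) = L exists, then
\[
\lim_{x \to 0} g'(x) = L
\quad \Longrightarrow \quad
g'(0) = \lim_{x \to 0} \frac{g(x) - g(0)}{x} = L,
\]
% since by the MVT, (g(x) - g(0))/x = g'(c_x) for some c_x
% between 0 and x, and c_x -> 0 forces g'(c_x) -> L.
% Applying this inductively to f, f', f'', ... yields
% f^{(n)}(0) = 0 for every n.
```

This spares you from computing the difference quotients for $f^{(n)}$ at $0$ by hand at each stage of the induction.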