1. Suppose f: \mathbb{R} \longrightarrow \mathbb{R} is differentiable and that f and f' have no common roots. Prove or disprove: f has at most finitely many zeros in any interval [a,b].

(You may assume that a = 0, \ b = 1. Note that if f' is continuous on [a,b], then the claim is easily seen to be true. But do we really need f' to be continuous?)
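
(For orientation, here is one sketch of the continuous case, using Rolle's theorem; the point x_0 and the points c_n below are introduced just for illustration. If f had infinitely many zeros in [0,1], then by Bolzano–Weierstrass some sequence of distinct zeros x_n would converge to a point x_0 \in [0,1], and continuity of f gives f(x_0) = 0. Applying Rolle's theorem between consecutive zeros gives points c_n with f'(c_n) = 0 and c_n \longrightarrow x_0, so continuity of f' gives f'(x_0) = 0 as well, contradicting the assumption that f and f' have no common roots. The question is which of these steps genuinely needs the continuity of f'.)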


2. Suppose f_n: \mathbb{R} \longrightarrow \mathbb{R}, \ n \in \mathbb{N}, is a sequence of continuous functions that converges pointwise on the interval [a,b]. Show that \{f_n\} is uniformly bounded on some subinterval of [a,b].

(Again you may assume that a=0, \ b=1. Note that the subinterval doesn't have to be [a,b] itself.)
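
(One standard route, sketched here assuming the Baire category theorem is available; the sets E_N below are introduced just for illustration. For N \in \mathbb{N}, let E_N = \{x \in [0,1] : |f_n(x)| \leq N \text{ for all } n\}. Each E_N is closed, since every f_n is continuous, and [0,1] = \bigcup_N E_N, since a pointwise convergent sequence of real numbers is bounded at each point. Because [0,1] is a complete metric space, Baire's theorem yields some E_N with nonempty interior, and any subinterval contained in that E_N is one on which \{f_n\} is uniformly bounded by N.)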