We can get what you want by applying the Cauchy–Schwarz inequality (see the Wikipedia article on the Cauchy–Schwarz inequality). That article states the theorem for arbitrary inner-product spaces; if you know what these are and how they apply to what you've asked about, then read no further. If not, I have provided an explanation below of how the inequality helps us establish the result.
Real-valued continuous functions defined on $[0,1]$ form a vector space over the real numbers: if we add two real-valued continuous functions defined on $[0,1]$ we again get a continuous real-valued function on $[0,1]$, and if we multiply such a function by a real scalar we again get a continuous real-valued function on $[0,1]$. This vector space is typically denoted by $C[0,1]$. When we have a vector space over a field, it is often desirable to attach what is known as an inner product to the vector space. The inner product allows us to define geometric quantities on our space, like lengths of vectors, angles between vectors, etc. The "dot product" you learned about in multivariable calculus is an inner product on $\mathbb{R}^n$, for example. The (standard) inner product on $C[0,1]$ is given by
$$\langle f, g \rangle = \int_0^1 f(x)\,g(x)\,dx.$$
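As a quick numerical sanity check (a sketch, not part of the proof), we can approximate this inner product with a Riemann sum and verify the Cauchy–Schwarz inequality $\langle f, g\rangle^2 \le \langle f, f\rangle \langle g, g\rangle$ for a sample pair of continuous functions; the helper `inner` below is illustrative, not a standard library function:

```python
# Numerical sketch: approximate the inner product <f, g> = integral of
# f(x) g(x) over [0, 1] with a midpoint Riemann sum, then check the
# Cauchy-Schwarz inequality <f, g>^2 <= <f, f> <g, g> for one example pair.
import math

def inner(f, g, n=100_000):
    """Approximate <f, g> on [0, 1] via a midpoint Riemann sum."""
    dx = 1.0 / n
    return sum(f((i + 0.5) * dx) * g((i + 0.5) * dx) for i in range(n)) * dx

f = math.sin          # any continuous function on [0, 1]
g = lambda x: 1.0     # the constant function 1

lhs = inner(f, g) ** 2
rhs = inner(f, f) * inner(g, g)
print(lhs <= rhs)     # prints True: Cauchy-Schwarz holds
```

The midpoint rule is only an approximation, but its error is far too small to flip the inequality for smooth functions like these.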
Now, as to how all of this pertains to your question: if we take $g$ to be the constant function $1$, the Cauchy–Schwarz inequality $\langle f, g\rangle^2 \le \langle f, f\rangle \langle g, g\rangle$ says that
$$\left( \int_0^1 f(x)\,dx \right)^2 \le \int_0^1 f(x)^2\,dx \cdot \int_0^1 1\,dx = \int_0^1 f(x)^2\,dx,$$
where equality holds if and only if $f$ and $1$ are linearly dependent. According to the assumption you've provided, we do have equality above. Hence $f$ and $1$ are linearly dependent; i.e., $f = c \cdot 1$ for some real constant $c$. We see that $f(x) = c$ for every $x \in [0,1]$, so $f$ is constant.
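To see the equality case concretely, here is a small illustrative check (again using a hypothetical Riemann-sum helper `inner`, not part of the argument itself): a constant function makes the two sides of the inequality agree up to quadrature error, while a non-constant function leaves a strictly positive gap.

```python
# Sketch of the equality case in Cauchy-Schwarz with g = 1:
# gap = <f, f><1, 1> - <f, 1>^2 is ~0 for constant f, positive otherwise.

def inner(f, g, n=100_000):
    """Approximate <f, g> = integral of f(x) g(x) over [0, 1], midpoint rule."""
    dx = 1.0 / n
    return sum(f((i + 0.5) * dx) * g((i + 0.5) * dx) for i in range(n)) * dx

one = lambda x: 1.0

const = lambda x: 3.0            # constant: f and 1 are linearly dependent
gap_const = inner(const, const) * inner(one, one) - inner(const, one) ** 2

nonconst = lambda x: x           # non-constant: f and 1 are independent
gap_nonconst = inner(nonconst, nonconst) * inner(one, one) - inner(nonconst, one) ** 2

print(abs(gap_const) < 1e-9)     # prints True: equality (up to float error)
print(gap_nonconst > 0)          # prints True: strict inequality (gap is 1/12)
```

For $f(x) = x$ the exact gap is $\int_0^1 x^2\,dx - \left(\int_0^1 x\,dx\right)^2 = \tfrac{1}{3} - \tfrac{1}{4} = \tfrac{1}{12}$, which the approximation reproduces closely.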
Does this answer your question? Let me know if anything is unclear. Good luck!