First, if a function is "infinitely differentiable", we can certainly form the Taylor's series for that function. But that is probably not the question you are really asking. You seem to be under the impression that if a function has a Taylor's series then the Taylor series must be equal to the function- and that is not true. For one thing, it does not follow that the series converges. And even if the Taylor's series for a function converges for all x, it does not necessarily equal the function on any interval. For example, the function defined by f(x)= e^(-1/x^2) if x is not 0, f(0)= 0, is infinitely differentiable at x= 0. (Every derivative is 0 at x= 0, and for x not 0 the nth derivative is e^(-1/x^2) times a polynomial in 1/x; that product goes to 0 as x goes to 0.) That is, its Maclaurin series (Taylor's series at x= 0) is identically equal to 0 but the function itself is not 0 except at x= 0.
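A quick numerical sketch of that example (the function name f is just for illustration): every Maclaurin coefficient of this function is 0, so every partial sum of its Maclaurin series is the zero function, yet the function itself is visibly nonzero away from 0.

```python
import math

def f(x):
    # The classic smooth-but-not-analytic example: e^(-1/x^2), patched at 0.
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# Every derivative of f at 0 is 0, so every Maclaurin partial sum is
# identically 0 -- no matter how many terms we take.
maclaurin_partial_sum = 0.0

print(f(0.5))                                  # e^(-4), about 0.0183, not 0
print(abs(f(0.5) - maclaurin_partial_sum))     # the gap never shrinks
```

So the series converges everywhere (to 0), but equals f only at the single point x= 0.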
If there exists an interval about x= a on which the Taylor's series for a function, f, converges and is equal to the function, we say that f is "analytic" at x= a (sometimes "real analytic", to distinguish it from "analytic" for functions of a complex variable, which is equivalent but typically defined differently).
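In symbols, that definition says (this is the standard formulation, written out for concreteness):

```latex
% f is (real) analytic at x = a iff there is an r > 0 such that,
% for all x in (a - r, a + r),
f(x) \;=\; \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(x-a)^n .
```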
How we show that a given function is "analytic" at a particular x= a is highly dependent on the precise function. For example, the simplest way to show that sine, cosine, and the exponential are analytic is to show that their Taylor's series satisfy the same differential equations and "initial values" as the functions themselves. Then, by the "existence and uniqueness theorem" for initial value problems, the series and the function must be equal.
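For sine, that argument shows the Maclaurin series sum (-1)^k x^(2k+1)/(2k+1)! equals sin(x) for every x. A numerical check (not a proof, of course; the function name sin_series is just for illustration):

```python
import math

def sin_series(x, n_terms=20):
    # Partial sum of the Maclaurin series for sine:
    #   sum over k of (-1)^k * x^(2k+1) / (2k+1)!
    return sum((-1)**k * x**(2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(n_terms))

# The partial sums agree with math.sin to machine precision
# once enough terms are taken.
for x in (0.5, 2.0):
    print(x, abs(math.sin(x) - sin_series(x)))
```

Both the series and sin(x) satisfy y'' = -y with y(0)= 0, y'(0)= 1, which is exactly why the uniqueness theorem forces them to agree.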