I won't claim that I have studied characteristic functions in depth, but here is my take on your question.
Basically, the characteristic function provides an equivalent representation of a distribution. We can specify a distribution by its cumulative distribution function, by its probability density function (when one exists), by the full sequence of its moments (though moments alone do not always determine a distribution uniquely), by its characteristic function, and so on. These are all ways of representing the same distribution.
It's just like taking the Fourier transform of a signal: we get an equivalent representation of it in the frequency domain, and the mapping is one-to-one. But of course I don't find it enlightening at all to think about the "frequency" of a distribution.
So, my view on the characteristic function is that it packages all the moments of the distribution into a single object: a weighted sum of the moments. This can be seen from its power series expansion (valid when all moments exist and the series converges):
$$\varphi_X(t) = E\left[e^{itX}\right] = \sum_{k=0}^{\infty} \frac{(it)^k}{k!}\,E\left[X^k\right].$$
Each of the moments can then be extracted from the characteristic function by differentiating at zero: $E[X^k] = i^{-k}\,\varphi_X^{(k)}(0)$.
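As a quick sanity check of this differentiation trick, here is a small symbolic sketch using SymPy. It assumes the standard normal as the example distribution, whose characteristic function is $\varphi(t) = e^{-t^2/2}$, and recovers its first four moments $(0, 1, 0, 3)$:

```python
import sympy as sp

t = sp.symbols('t', real=True)

# Characteristic function of the standard normal N(0, 1):
# phi(t) = E[e^{itX}] = exp(-t^2 / 2)
phi = sp.exp(-t**2 / 2)

# k-th moment via differentiation: E[X^k] = phi^{(k)}(0) / i^k
moments = [sp.simplify(sp.diff(phi, t, k).subs(t, 0) / sp.I**k)
           for k in range(1, 5)]

print(moments)  # [0, 1, 0, 3]
```

The odd moments vanish by symmetry, and $E[X^2]=1$, $E[X^4]=3$ match the known moments of the standard normal.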