Characteristic Function (or Fourier Transform) of ANY probability distribution

First of all, I'm sorry, but I don't know LaTeX yet; the equations are simple enough that they should still be understandable.

Let x be a univariate real random variable described by a probability distribution d(x).

d(x) is generally a distribution, not a function, but we know it defines a well-defined measure on a sigma algebra, that it is always non-negative, and that the measure of the whole real line is 1.

Does d(x) always have a Fourier Transform (or equivalently, a characteristic function)?

I think it does, but I'm not 100% sure, because one might imagine a badly behaved probability distribution (a distribution, not a function) that is not Fourier transformable.

I think we can simply use the following fact: defining f(w) as E[exp(iwx)], the basic properties of the integral (or of the measure) give |f(w)| <= E[|exp(iwx)|] = E[1] = 1.
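As a quick numerical sanity check of this bound (my own sketch, not from any reference), here is a Monte Carlo estimate of f(w) for a standard Cauchy distribution, chosen because it has no mean yet still has a characteristic function. Since f(w) is estimated as an average of unit-modulus complex numbers, the triangle inequality forces the estimate's absolute value to stay at or below 1 for every w, exactly as the inequality predicts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Standard Cauchy: heavy-tailed, no mean, yet E[exp(iwx)] is well defined
# (its exact characteristic function is exp(-|w|)).
x = rng.standard_cauchy(200_000)

for w in (0.0, 0.5, 1.0, 2.0):
    f_w = np.mean(np.exp(1j * w * x))  # Monte Carlo estimate of f(w)
    # |f(w)| <= E[|exp(iwx)|] = E[1] = 1 holds for every w
    print(w, abs(f_w))
```

Note that the bound holds exactly for the estimator, not just in the limit: an average of numbers on the unit circle can never have modulus greater than 1.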

We can divide the range of x into a countable set of non-overlapping intervals that cover the whole real line (x is defined over a sigma algebra, so we can do this) and evaluate f(w) as the sum of the integrals over each of these intervals.

We therefore express f(w) as an infinite series S of complex numbers.

We know that the absolute value of each of these integrals is no greater than the probability of its interval, and the sum of the probabilities of these intervals is bounded (it is exactly 1), so the series S converges absolutely and therefore converges.

But can anyone find a counterexample: a weird probability distribution for which the series S doesn't converge? And in general, does the property hold that the integral of exp(iwx)d(x) has an absolute value no greater than the integral of d(x)?

I don't remember measure theory very well, but since we know d(x) is a probability distribution, I guess its measure is well behaved...

Thank you all for your attention; if this result turns out to always be true, it will be very useful in future proofs!