# Characteristic Function (or Fourier Transform) of ANY probability distribution

• January 23rd 2010, 05:52 AM
rargh
Characteristic Function (or Fourier Transform) of ANY probability distribution
First of all, I'm sorry that I don't know LaTeX yet, but the equations are simple enough to be understandable.

Let x be a real univariate random variable described by a probability distribution d(x).

d(x) is generally a distribution, not a function, but we know it defines a well-defined measure on a sigma-algebra, that this measure is nonnegative, and that the measure of the whole space is 1.

Does d(x) always have a Fourier transform (or, equivalently, a characteristic function)?

I think it does, but I'm not 100% sure, because one might think of a not-so-well-behaved probability distribution (a distribution, not a function) that is not Fourier transformable.

I think that we simply use the fact that, defining $f(\omega)$ as $E[e^{i\omega x}]$, we know from the basic property of an integral (or measure) that $|f(\omega)|\leq E[|e^{i\omega x}|]=E[1]=1$.
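As a quick numerical sanity check of that bound, here is a small Python sketch (my own, using NumPy; the standard normal, sample size, and tolerances are just illustrative choices) that estimates f(w) by Monte Carlo and verifies |f(w)| <= 1:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)  # illustrative choice of distribution

for w in [0.0, 0.5, 1.0, 3.0]:
    # empirical estimate of f(w) = E[exp(iwx)]
    f_w = np.mean(np.exp(1j * w * x))
    # |mean of unit-modulus numbers| <= 1 by the triangle inequality
    assert abs(f_w) <= 1.0 + 1e-12
    # for the standard normal the exact value is exp(-w^2/2)
    assert abs(f_w - np.exp(-w**2 / 2)) < 0.01
```

The first assertion is exactly the inequality above, applied to the empirical measure; the second just confirms the estimate against the known normal characteristic function.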

We can divide the range of x into a countable set of non-overlapping intervals that cover the whole range (x is defined on a sigma-algebra, so we can do this) and evaluate f(w) as the sum of the integrals over each of these intervals.

We therefore express f(w) as an infinite series S of complex numbers.

We know that each of these integrals has an absolute value no greater than the probability of its interval, and the sum of the probabilities of these intervals is bounded (it is 1), so the series S converges absolutely, and hence converges.
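The partition argument can be checked numerically as well. Below is a sketch (my own construction, not from the thread) that splits the line into intervals, integrates exp(iwx)d(x) over each one by the midpoint rule for a standard normal d(x), and verifies that the absolute values of the terms sum to at most 1 while the series itself recovers f(w):

```python
import numpy as np

w = 2.0
# partition [-10, 10]; the tails beyond +-10 carry negligible N(0,1) mass
edges = np.linspace(-10.0, 10.0, 201)

def normal_pdf(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

terms = []
for a, b in zip(edges[:-1], edges[1:]):
    n = 100
    dx = (b - a) / n
    xs = a + (np.arange(n) + 0.5) * dx  # midpoint rule on [a, b)
    terms.append(np.sum(np.exp(1j * w * xs) * normal_pdf(xs)) * dx)

S = sum(terms)
# each |term| <= P([a, b)), so the sum of absolute values is at most 1:
assert sum(abs(t) for t in terms) <= 1.0
# and S agrees with the known normal characteristic function exp(-w^2/2):
assert abs(S - np.exp(-w**2 / 2)) < 1e-4
```

This is only a finite partition of a truncated range, but it mirrors the argument: absolute convergence follows because the term-wise bounds sum to the total probability.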

But can anyone find a counterexample: a weird probability distribution for which the series S doesn't converge? More generally, does the integral of exp(iwx)d(x) always have an absolute value no greater than the integral of d(x)?
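One natural candidate for a "weird" distribution is a heavy-tailed one like the standard Cauchy, which has no mean at all. Yet because exp(iwx) stays on the unit circle, its characteristic function is still perfectly well defined (it is known to be exp(-|w|)). A quick Monte Carlo sketch of mine illustrating this, with illustrative sample size and tolerances:

```python
import numpy as np

rng = np.random.default_rng(1)
# heavy-tailed: E[|X|] is infinite, but exp(iwX) is still bounded
x = rng.standard_cauchy(500_000)

for w in [0.5, 1.0, 2.0]:
    f_w = np.mean(np.exp(1j * w * x))
    assert abs(f_w) <= 1.0 + 1e-12
    # the standard Cauchy characteristic function is exp(-|w|)
    assert abs(f_w - np.exp(-abs(w))) < 0.01
```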

I don't remember measure theory well, but since d(x) is a probability distribution, I would guess its measure is well behaved.

Thank you all for your attention; if this result turns out to be true in general, it will be very useful in future proofs!
• January 23rd 2010, 08:55 AM
Laurent
Quote:

Originally Posted by rargh
Let x be a real univariate random variable described by a probability distribution d(x).

d(x) is generally a distribution, not a function, but we know it defines a well-defined measure on a sigma-algebra, that this measure is nonnegative, and that the measure of the whole space is 1.

Does d(x) always have a Fourier transform (or, equivalently, a characteristic function)?

I think it does, but I'm not 100% sure, because one might think of a not-so-well-behaved probability distribution (a distribution, not a function) that is not Fourier transformable.

I think that we simply use the fact that, defining $f(\omega)$ as $E[e^{i\omega x}]$, we know from the basic property of an integral (or measure) that $|f(\omega)|\leq E[|e^{i\omega x}|]=E[1]=1$.

This last sentence is indeed the proof that any probability measure has a Fourier transform (a very useful fact in probability theory). More precisely, the proof is: since almost-surely $|e^{i\omega X}|\leq 1$, and 1 is integrable (with respect to the probability measure $P$), the random variable $e^{i\omega X}$ is integrable, so that we can define $f(\omega)=E[e^{i\omega X}]$; this function is the Fourier transform of the law of $X$. (Or equivalently, you can say that $|e^{i\omega x}|\leq 1$ for all $x$, and 1 is integrable with respect to $d(X)$, hence we can define $f(\omega)=\int e^{i\omega x} (d(X))(dx)$, the integral with respect to $d(X)$.)
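To emphasize that this works even when the law has no density function at all, here is a small sketch of mine for a mixed law: half an atom at 0, half Uniform(0, 1). The characteristic function is still defined and bounded, and its exact value is 1/2 + (1/2)(e^{iw} - 1)/(iw):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400_000
# a law with no density: with prob 1/2 a point mass at 0, else Uniform(0, 1)
atom = rng.random(n) < 0.5
x = np.where(atom, 0.0, rng.random(n))

w = 3.0
f_w = np.mean(np.exp(1j * w * x))
# mixture of the atom's contribution (= 1) and the uniform's (e^{iw}-1)/(iw)
exact = 0.5 + 0.5 * (np.exp(1j * w) - 1) / (1j * w)
assert abs(f_w) <= 1.0 + 1e-12
assert abs(f_w - exact) < 0.01
```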

I didn't understand what you wrote afterward. The above proof is finished anyway.
• January 24th 2010, 05:41 AM
rargh
Great, thanks! Sorry if the last part wasn't so clear...