- Dec 27th 2010, 07:36 PM, chiph588@: Randomized Convergence
I have essentially no knowledge of statistics, so this may be a well known topic.

Let $\epsilon_n \in \{-1,1\}$ be arbitrary.

What are the odds $\sum_{n=1}^{\infty} \frac{\epsilon_n}{n}$ converges?

-Thanks all!

- Dec 28th 2010, 12:01 AM, CaptainBlack
Is that what you really want to ask? What you are asking appears to be:

does (or rather what is the probability that):

$\sum_{n=1}^{\infty} \frac{z_n}{n}$

converges, where the $z_n$'s are sampled uniformly on the top half of the unit circle in the complex plane.

If that is what you mean, then the answer is probably (I will need to work out how to prove it but this is what I would place my money on): the real part converges with probability 1 and the imaginary part converges with probability 0.

The heuristic argument behind this is that the real part behaves on average like the alternating harmonic series (the terms $\operatorname{Re}(z_n)/n$ have mean $0$), while the imaginary part behaves on average like the harmonic series (the terms $\operatorname{Im}(z_n)/n$ have mean $\frac{2}{\pi n}$, whose sum diverges).
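As a quick sanity check of that heuristic (my own illustration, not part of the thread), the following sketch estimates the per-term means numerically: for $\theta$ uniform on $[0,\pi]$, the real part $\cos\theta$ averages to $0$ while the imaginary part $\sin\theta$ averages to $2/\pi$.

```python
import math
import random

# Sketch (not from the thread): estimate E[cos(theta)] and E[sin(theta)]
# for theta uniform on [0, pi].  The real-part mean is 0, so those terms
# behave like a mean-zero (alternating-like) series; the imaginary-part
# mean is 2/pi, so those terms sum like a multiple of the harmonic series.
random.seed(0)
N = 1_000_000
thetas = [random.uniform(0.0, math.pi) for _ in range(N)]
mean_re = sum(math.cos(t) for t in thetas) / N
mean_im = sum(math.sin(t) for t in thetas) / N

print(mean_re)  # close to 0
print(mean_im)  # close to 2/pi, about 0.6366
```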

CB

- Dec 28th 2010, 12:05 AM, chiph588@
- Dec 28th 2010, 12:27 AM, CaptainBlack
Ahhh... rereading this, what I should have taken it to mean is:

does (or rather what is the probability that):

$\sum_{n=1}^{\infty} \frac{\epsilon_n}{n}$

converges, where the $\epsilon_n$'s are sampled uniformly on $\{-1,1\}$ (that is, take the values $\pm 1$ each with probability $\frac{1}{2}$).

Then the same heuristic would suggest that this converges with probability 1. I can show that the partial sums approach a random variable with zero mean and variance $\sum_{n=1}^{\infty} \frac{1}{n^2} = \frac{\pi^2}{6}$ (but this only guarantees that the sum becomes unbounded with probability 0, not that the sum converges).
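A quick simulation (my own sketch, not CB's computation) of the partial sums $S_N = \sum_{n \le N} \epsilon_n/n$ bears this out: the empirical mean is near $0$ and the empirical variance is near $\sum_n 1/n^2 = \pi^2/6$.

```python
import math
import random

# Sketch (illustration only): simulate S_N = sum_{n<=N} eps_n / n with
# eps_n = +/-1 equally likely, and compare the empirical mean and
# variance of S_N with the theoretical values 0 and sum 1/n^2 ~ pi^2/6.
random.seed(1)
N, TRIALS = 2_000, 1_000
samples = []
for _ in range(TRIALS):
    s = sum(random.choice((-1, 1)) / n for n in range(1, N + 1))
    samples.append(s)

mean = sum(samples) / TRIALS
var = sum((s - mean) ** 2 for s in samples) / TRIALS
print(mean)  # close to 0
print(var)   # close to pi^2/6, about 1.645
```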

CB

- Dec 28th 2010, 01:39 AM, Moo
Hello,

Considering CB's formulation: does the series $\sum_{n \geq 1} \frac{\epsilon_n}{n}$, where $P(\epsilon_n = 1) = P(\epsilon_n = -1) = \frac{1}{2}$, converge? Yes it does, and we can prove it using martingales. I'm sorry, but for that you will need some knowledge of probability :p

Consider the natural filtration $\mathcal{F}_n = \sigma(\epsilon_1, \ldots, \epsilon_n)$ and define $M_n = \sum_{k=1}^{n} \frac{\epsilon_k}{k}$.

It's easy to prove that $(M_n)$ is an $(\mathcal{F}_n)$-martingale, because the $\epsilon_k$ are independent with mean 0.

With this independence and mean 0, we can also write that $E[M_n^2] = \sum_{k=1}^{n} \frac{1}{k^2} \leq \frac{\pi^2}{6}$.

Hence $(M_n)$ is bounded in $L^2$, and from a martingale convergence theorem we deduce that it converges almost surely and in $L^2$ to a random variable $M_\infty$, which is in $L^2$.

So we get that $\sum_{n \geq 1} \frac{\epsilon_n}{n}$ converges almost surely (that is to say, with probability 1).

- Dec 29th 2010, 01:26 AM, matheagle
You can use the Khintchine-Kolmogorov Convergence Theorem instead: the terms $\epsilon_n/n$ are independent with mean 0, and $\sum_{n \geq 1} \operatorname{Var}(\epsilon_n/n) = \sum_{n \geq 1} \frac{1}{n^2} < \infty$, so the series converges almost surely.
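The key identity underlying both this theorem's hypothesis and the $L^2$ bound above, $E[M_n^2] = \sum_{k \le n} 1/k^2$, can be verified exactly for small $n$ by averaging over all $2^n$ equally likely sign patterns (a sketch of mine, not from the thread):

```python
import math
from itertools import product

# Sketch: verify E[M_n^2] = sum_{k<=n} 1/k^2 exactly by enumerating all
# 2^n sign patterns (n = 10 kept small so the enumeration stays cheap).
n = 10
total = 0.0
for signs in product((-1, 1), repeat=n):
    m = sum(s / k for k, s in zip(range(1, n + 1), signs))
    total += m * m
second_moment = total / 2 ** n  # exact E[M_n^2], up to rounding

expected = sum(1 / k ** 2 for k in range(1, n + 1))
print(second_moment, expected)  # the two agree (up to float rounding)
```

The agreement reflects the orthogonality of the martingale increments: all the cross terms $E[\epsilon_j \epsilon_k]/(jk)$ with $j \neq k$ vanish.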

- Dec 29th 2010, 07:55 AM, chisigma
Let's define a random variable as...

$X = \sum_{n=1}^{\infty} \frac{\epsilon_n}{n}$ (1)

... where the $\epsilon_n$ are discrete random variables with $P(\epsilon_n = \pm 1) = \frac{1}{2}$. Each term $\frac{\epsilon_n}{n}$ has p.d.f. given by...

$f_n(x) = \frac{1}{2}\,\delta\!\left(x - \frac{1}{n}\right) + \frac{1}{2}\,\delta\!\left(x + \frac{1}{n}\right)$ (2)

... and each has Fourier transform given by...

$F_n(\omega) = \cos \frac{\omega}{n}$ (3)

Setting $f(x)$ the p.d.f. of $X$, its Fourier transform is...

$F(\omega) = \prod_{n=1}^{\infty} \cos \frac{\omega}{n}$ (4)

Now $f(x)$ [if it exists...] can be obtained as inverse Fourier transform of $F(\omega)$... but that requires some more effort from me! (Thinking)...

Kind regards

- Dec 29th 2010, 08:58 AM, Moo
- Dec 29th 2010, 12:21 PM, chisigma
The following example will [I do hope...] clarify...

... let's define the random variable $Y$ as...

$Y = \sum_{n=1}^{\infty} \frac{\epsilon_n}{2^n}$ (1)

... where the $\epsilon_n$ are discrete random variables with $P(\epsilon_n = \pm 1) = \frac{1}{2}$. Each term $\frac{\epsilon_n}{2^n}$ has p.d.f. given by...

$f_n(x) = \frac{1}{2}\,\delta\!\left(x - \frac{1}{2^n}\right) + \frac{1}{2}\,\delta\!\left(x + \frac{1}{2^n}\right)$ (2)

... and each has Fourier transform given by...

$F_n(\omega) = \cos \frac{\omega}{2^n}$ (3)

Setting $f(x)$ the p.d.f. of $Y$, its Fourier transform is...

$F(\omega) = \prod_{n=1}^{\infty} \cos \frac{\omega}{2^n}$ (4)

Now it is well known the 'infinite product'...

$\prod_{n=1}^{\infty} \cos \frac{\omega}{2^n} = \frac{\sin \omega}{\omega}$ (5)

... so that is...

$f(x) = \begin{cases} \frac{1}{2} & |x| < 1 \\ 0 & |x| > 1 \end{cases}$ (6)

... i.e. $Y$ is uniformly distributed between -1 and +1... and that's not a surprise! (Wink)...
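Both claims are easy to check numerically (a sketch of my own, not part of the thread): the partial products of (5) approach $\sin\omega/\omega$, and the total variance of the series matches that of a uniform variable on $[-1, 1]$, namely $1/3$.

```python
import math

# Sketch: numeric check of the infinite product (5) and of the uniform
# limit.  prod_{n>=1} cos(w/2^n) should equal sin(w)/w, and
# sum_{n>=1} Var(eps_n/2^n) = sum 1/4^n = 1/3, the variance of a
# uniform distribution on [-1, 1].
w = 1.7  # arbitrary test point
prod = 1.0
for n in range(1, 40):
    prod *= math.cos(w / 2 ** n)
print(prod, math.sin(w) / w)  # the two agree

var_series = sum(1 / 4 ** n for n in range(1, 40))
print(var_series, 1 / 3)      # the two agree
```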

For the [very interesting...] question proposed by chiph588@ we have to establish if the following limit exists or not...

$F(\omega) = \lim_{N \rightarrow \infty} \prod_{n=1}^{N} \cos \frac{\omega}{n}$ (7)

Kind regards
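A quick numeric look at (7) (my own sketch, not chisigma's follow-up): for large $n$, $\log \cos(\omega/n) \approx -\omega^2/(2n^2)$, which is summable, so for a fixed $\omega$ with no vanishing factor the partial products should settle down. The snippet below compares the partial product at $N = 10^3$ with the one at $N = 10^5$.

```python
import math

# Sketch: partial products of F(w) = prod_{n>=1} cos(w/n) at w = 1.
# Since w/n < pi/2 for all n >= 1, every factor is positive, and
# log cos(w/n) ~ -w^2/(2 n^2) is summable, so the partial products
# should be nearly identical for N = 1_000 and N = 100_000.
w = 1.0
partials = {}
prod = 1.0
for n in range(1, 100_001):
    prod *= math.cos(w / n)
    if n in (1_000, 100_000):
        partials[n] = prod

print(partials[1_000], partials[100_000])  # nearly identical
```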