A man tossing a coin wins one point for heads and five points for tails. The game stops when the man accumulates at least 1000 points.
How do you accurately estimate the expected length of the game?
A first observation is that the game necessarily stops between 200 and 1000 tosses. Those numbers being "large" enough, and each toss having expectation 3 and variance 4 (i.e. finite), the Law of Large Numbers tells us that a first estimate of the average length is $1000/3 \approx 334$.
But obviously it is possible to do better than that. (sub)Martingale theory?
Thanks for your help.
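For a quick numerical check of that first estimate, here is a minimal Monte Carlo sketch (assuming a fair coin; the function name is mine):
[code]
import random

def game_length(target=1000):
    """Play one game: +1 point for heads, +5 for tails, stop once the total reaches target."""
    points, tosses = 0, 0
    while points < target:
        points += 1 if random.random() < 0.5 else 5
        tosses += 1
    return tosses

# crude Monte Carlo estimate of E[T]
n_runs = 100_000
print(sum(game_length() for _ in range(n_runs)) / n_runs)  # typically prints a value close to 334
[/code]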
Wald's identity being based on the Optional Sampling Theorem, one can prove the result directly from it, using the fact that $M_n = S_n - 3n$ is a martingale (where $S_n = X_1 + \cdots + X_n$ and $X_i$ is the random variable describing each toss).
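For comparison, the exact expectation can also be computed by a short backward recursion on the current point total; a minimal sketch (the function name is mine, same 1/2–1/2 coin assumed):
[code]
def expected_length(target=1000):
    """Exact E[number of tosses] by backward recursion on the current point total."""
    # e[s] = expected number of remaining tosses when the current total is s
    e = [0.0] * (target + 5)              # totals >= target: the game is over
    for s in range(target - 1, -1, -1):
        e[s] = 1 + 0.5 * e[s + 1] + 0.5 * e[s + 5]
    return e[0]

print(expected_length())  # about 333.9: slightly above 1000/3 because S_T overshoots 1000 a little
[/code]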
Thanks for your answer.
I have one last question about this problem though: how do you prove that $T$ and the $X_i$ are independent?
This is needed to use Wald's equation, and although it might seem obvious to some, I was wondering if you had some fully convincing arguments.
Thanks in advance.
The devil is always in the details.
I don't know how it could be even remotely obvious... On the contrary, they're obviously not independent, and they don't need to be; $T$ only needs to be a stopping time relative to the natural filtration of $(X_n)_{n\ge 1}$.
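To make the non-independence concrete (this little check is only an illustration, with names of my own): if the first toss gives 5 points the game tends to end a bit earlier, so $T$ and $X_1$ cannot be independent.
[code]
import random

def first_toss_and_length(target=1000):
    """Return (value of the first toss, total number of tosses) for one game."""
    points, tosses, first = 0, 0, None
    while points < target:
        step = 1 if random.random() < 0.5 else 5
        if first is None:
            first = step
        points += step
        tosses += 1
    return first, tosses

runs = [first_toss_and_length() for _ in range(100_000)]
avg_len_after_heads = sum(t for x, t in runs if x == 1) / max(1, sum(1 for x, _ in runs if x == 1))
avg_len_after_tails = sum(t for x, t in runs if x == 5) / max(1, sum(1 for x, _ in runs if x == 5))
print(avg_len_after_heads, avg_len_after_tails)  # the conditional means differ by about 4/3 of a toss
[/code]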
Originally Posted by akbar
Agree. I am asking this because it is "obviously" a requirement for the proof of Wald's equation, as shown on Wikipedia:
Wald's equation - Wikipedia, the free encyclopedia
In the second proof, there is the requirement that $X_n$ be independent of $\{T \ge n\}$ for every $n$. So it makes sense in that case.
Note that this independence condition is not mentioned in the PlanetMath version of the result, from which the wiki article is taken.
So I hope you understand my need for an expert's view on the matter.
1. The wikipedia page states a very simple result (whose proof amounts to conditioning on the value of $T$ and using the assumed independence); this is not the "full" Wald's equation. The version you need here is the one where $T$ is a stopping time, that's the one I quoted.
2. Of course the PlanetMath page is wrong, and doesn't even make sense: what does it mean that $X_1,\dots,X_T$ are i.i.d. when $T$ is a random variable itself?
3. Your first answer suggested you knew the optional stopping theorem: then you have one proof with no independence needed between $T$ and the $X_i$'s. This proof works in your case since $T$ is bounded, and the steps $X_i$ are as well.
4. In order to get the general result I mentioned in my first post, you would need to avoid the optional stopping theorem (which requires for instance uniform integrability of the stopped martingale $(M_{T\wedge n})_{n\ge 0}$) and write a direct (very short) proof:
$$\mathbb{E}[S_T]=\mathbb{E}\Big[\sum_{n\ge 1}X_n\,\mathbf{1}_{\{T\ge n\}}\Big]=\sum_{n\ge 1}\mathbb{E}\big[X_n\,\mathbf{1}_{\{T\ge n\}}\big]$$
and, since $T$ is a stopping time, $\mathbf{1}_{\{T\ge n\}}=1-\mathbf{1}_{\{T\le n-1\}}$ depends on $X_1,\dots,X_{n-1}$ only and is independent of $X_n$, hence:
$$\mathbb{E}[S_T]=\sum_{n\ge 1}\mathbb{E}[X_n]\,\mathbb{P}(T\ge n)=\mathbb{E}[X_1]\sum_{n\ge 1}\mathbb{P}(T\ge n)=\mathbb{E}[X_1]\,\mathbb{E}[T]$$
...as simple as that. (For full rigour, I should first prove that $\mathbb{E}\big[\sum_{n\ge 1}|X_n|\,\mathbf{1}_{\{T\ge n\}}\big]<\infty$, which justifies exchanging sum and expectation, and which is obtained via the exact same lines plus the triangle inequality.)
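If one wants to see the identity at work numerically, a last small sketch (again only an illustration, names are mine): simulate the game and compare $\mathbb{E}[S_T]$ with $3\,\mathbb{E}[T]$.
[code]
import random

def play(target=1000):
    """One game: return (number of tosses T, final total S_T)."""
    points, tosses = 0, 0
    while points < target:
        points += 1 if random.random() < 0.5 else 5
        tosses += 1
    return tosses, points

runs = [play() for _ in range(100_000)]
mean_T = sum(t for t, _ in runs) / len(runs)
mean_ST = sum(s for _, s in runs) / len(runs)
print(mean_ST, 3 * mean_T)   # the two numbers agree, as Wald's identity E[S_T] = 3 E[T] predicts
print(mean_ST - 1000)        # the small average overshoot past 1000 (about 1.7)
[/code]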