
Thread: Tossing game

  1. #1
    Junior Member
    Joined
    Nov 2009
    Posts
    57

    Tossing game

    A man tossing a coin wins one point for heads and five points for tails. The game stops when the man accumulates at least 1000 points.
    How do you estimate with accuracy $\displaystyle \pm 2$ the expectation of the length of the game?

A first observation is that the game necessarily stops after between 200 and 1000 tosses. Those numbers being "large" enough, and the expectation of a single toss being 3 with variance 4 (i.e. finite), the Law of Large Numbers tells us that a first estimate of the average length is $\displaystyle 1000/3\approx 334$.
    But obviously it is possible to do better than that. (sub)Martingale theory?
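As a quick empirical sanity check of that estimate (not part of the question itself), here is a Monte Carlo sketch in Python; the function name and parameters are mine:

```python
import random

def game_length(rng):
    """Simulate one game: +1 point for heads, +5 for tails, stop at >= 1000."""
    points = tosses = 0
    while points < 1000:
        points += 1 if rng.random() < 0.5 else 5
        tosses += 1
    return tosses

rng = random.Random(0)
trials = 100_000
avg = sum(game_length(rng) for _ in range(trials)) / trials
print(round(avg, 2))  # empirically close to 334
```

With 100,000 trials the standard error of the average is well under 0.1, so the simulated mean should land comfortably inside the $\pm 2$ window around 334.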

    Thanks for your help.
    Last edited by akbar; Jan 8th 2010 at 06:27 AM.
    Follow Math Help Forum on Facebook and Google+

  2. #2
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by akbar View Post
    A man tossing a coin wins one point for heads and five points for tails. The game stops when the man accumulates at least 1000 points.
    How do you estimate with accuracy $\displaystyle \pm 2$ the expectation of the length of the game?
    This is a typical application of Wald's identity: if $\displaystyle X_1,X_2,\ldots$ are i.i.d. integrable random variables and $\displaystyle T$ is an integrable stopping time, then letting $\displaystyle S_n=X_1+\cdots+X_n$, we have:

    $\displaystyle E[S_T]=E[X_1]E[T]$.
(This generalizes to martingales; in fact, you can either prove it directly in this case, or apply a general martingale theorem to the (bounded) martingale $\displaystyle M_n=S_{n\wedge T}-(n\wedge T)E[X_1]$.)

    In your case, you don't know $\displaystyle E[S_T]$ where $\displaystyle T=\inf\{i|S_i\geq 1000\}$, but you know it is between 1000 and 1004.
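Dividing those bounds by $E[X_1]=3$ already pins $E[T]$ down to within the required $\pm 2$; a one-line numerical check (a sketch, in Python):

```python
mean_step = (1 + 5) / 2        # E[X_1] = 3 for a fair coin
lower = 1000 / mean_step       # from 1000 <= E[S_T]
upper = 1004 / mean_step       # from E[S_T] <= 1004
print(lower, upper)            # 333.33... and 334.66...
```

The interval has width $4/3 < 2$, so answering "about 334" meets the accuracy requirement.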

  3. #3
    Junior Member
    Joined
    Nov 2009
    Posts
    57
Wald's identity being based on the Optional Sampling Theorem, one can prove the result directly from it, using the fact that $\displaystyle Y_n=\sum_{k=1}^n X_k-3n$ is a martingale ($\displaystyle X_n$ being the random variable describing each toss).
    Thanks for your answer.
    Last edited by akbar; Jan 8th 2010 at 08:07 AM.

  4. #4
    Junior Member
    Joined
    Nov 2009
    Posts
    57
    I have one last question about this problem though: how do you prove that $\displaystyle T$ and the $\displaystyle X_i$ are independent?
    This is needed to use Wald's equation, and although it might seem obvious to some, I was wondering if you had some fully convincing arguments.
    Thanks in advance.

The devil is always in the details.

  5. #5
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by akbar View Post
    I have one last question about this problem though: how do you prove that $\displaystyle T$ and the $\displaystyle X_i$ are independent?
    This is needed to use Wald's equation, and although it might seem obvious to some, I was wondering if you had some fully convincing arguments.
I don't know how it could be even remotely obvious... On the contrary, they are obviously not independent, and they don't need to be: $\displaystyle T$ only needs to be a stopping time with respect to the natural filtration of $\displaystyle X_1,X_2,\ldots$

  6. #6
    Junior Member
    Joined
    Nov 2009
    Posts
    57
Agreed. I am asking because independence "obviously" appears as a requirement in the proof of Wald's equation given on Wikipedia:

    Wald's equation - Wikipedia, the free encyclopedia

In the second proof there, one needs $\displaystyle \mathbb{E}(X_i|T=t)=\mathbb{E}(X_i)$ for $\displaystyle i\leq t$. So the independence assumption makes sense in that case.

Note that this independence condition is not mentioned in the PlanetMath version of the result, from which the wiki article is taken.
So I hope you understand my need for an expert's view on the matter.
    Cordialement.

  7. #7
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by akbar View Post
Agreed. I am asking because independence "obviously" appears as a requirement in the proof of Wald's equation given on Wikipedia:

    Wald's equation - Wikipedia, the free encyclopedia

In the second proof there, one needs $\displaystyle \mathbb{E}(X_i|T=t)=\mathbb{E}(X_i)$ for $\displaystyle i\leq t$. So the independence assumption makes sense in that case.

Note that this independence condition is not mentioned in the PlanetMath version of the result, from which the wiki article is taken.
So I hope you understand my need for an expert's view on the matter.
    Cordialement.
1. The Wikipedia page states a very simple result (proof: $\displaystyle E[X_1+\cdots+X_T]=E[E[X_1+\cdots+X_T|T]]=E[T\,E[X_1]]=E[T]E[X_1]$); this is not the "full" Wald's equation. The version you need here is the one where $\displaystyle T$ is a stopping time, which is the one I quoted.

2. Of course the PlanetMath page is wrong, and doesn't even make sense: what would it mean for $\displaystyle X_1,\ldots,X_N$ to be i.i.d. when $\displaystyle N$ is itself a random variable?

3. Your first answer suggested you knew the optional stopping theorem: that gives you a proof with no independence needed between $\displaystyle T$ and the $\displaystyle X_i$'s. This proof works in your case since $\displaystyle T$ is bounded, and the steps $\displaystyle X_i$ as well.

    4. In order to get the general result I mentioned in my first post, you would need to avoid the optional stopping theorem (which requires for instance a uniform integrability of $\displaystyle (S_{n\wedge T})_{n\geq 0}$) and write a direct (very short) proof:

    $\displaystyle E[X_1+\cdots+X_T]=\sum_{n=1}^\infty E[X_n {\bf 1}_{\{T\geq n\}}]$
    and, since T is a stopping time, $\displaystyle \{T\geq n\}(=\{T\leq n-1\}^c)$ depends on $\displaystyle X_1,\ldots,X_{n-1}$ and is independent of $\displaystyle X_n$, hence:

    $\displaystyle E[X_1+\cdots+X_T]=\sum_{n=1}^\infty E[X_n]P(T\geq n)=E[X_1]\sum_{n=1}^\infty P(T\geq n)$ $\displaystyle =E[X_1]E[T]$.
...as simple as that. (For full rigour, I should first prove that $\displaystyle E[|X_1+\cdots+X_T|]<\infty$, which is obtained via the exact same lines plus the triangle inequality.)
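For this particular game, the identity $E[S_T]=E[X_1]E[T]$ can also be checked empirically; a simulation sketch (Python, not from the thread, with names of my choosing):

```python
import random

rng = random.Random(1)

def play():
    """Return (S_T, T): final score and number of tosses for one game."""
    s = n = 0
    while s < 1000:
        s += rng.choice((1, 5))  # heads = 1 point, tails = 5 points
        n += 1
    return s, n

trials = 50_000
results = [play() for _ in range(trials)]
mean_s = sum(s for s, _ in results) / trials
mean_t = sum(n for _, n in results) / trials
# Wald's identity: E[S_T] = E[X_1] * E[T] = 3 * E[T]
print(mean_s, 3 * mean_t)
```

The two printed values should agree up to sampling error, and the estimate of $E[S_T]$ necessarily lands in $[1000, 1004]$ since every terminal score does.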

