
Math Help - Sum of i.i.d. r.v.

  1. #1 mabruka

    Sum of i.i.d. r.v.

    If \xi_1,\ldots,\xi_n are integrable i.i.d. (independent, identically distributed) random variables, show that

    E\left[\xi_1 | \xi_1+\cdots+\xi_n\right] =\frac{\xi_1+\cdots+\xi_n}{n}



    Any help would be much appreciated. I can post what I've done so far and where I'm stuck; would that help?

  2. #2 Moo
    Hello,

    This is a very common problem. I'm copying it from something I typed before, so don't worry if the \xi are transformed into X.

    (X_1,\dots,X_n)\stackrel{distribution}{=} (X_{\sigma(1)},\dots,X_{\sigma(n)}) ~,~ \forall \sigma\in\mathfrak{S}_n (the set of permutations of \{1,\dots,n\}), because the random variables are i.i.d.: independence together with identical marginal distributions makes the joint distribution invariant under permuting the coordinates.

    Considering the permutation \sigma that swaps 1 and k, for some k, we obtain that Y_k=(X_k,X_2,\dots,X_{k-1},X_1,X_{k+1},\dots,X_n) has the same distribution as (X_1,\dots,X_n).
    So for every k=1,\dots,n, the Y_k follow the same distribution.

    Now consider the function :

    \begin{aligned} f ~:~ \mathbb{R}^n & \to \mathbb{R}^2 \\ (x_1,\dots,x_n) & \mapsto \Big(x_1,\sum_{i=1}^n x_i\Big) \end{aligned}

    and apply it to the vectors Y_k.

    It follows that the pairs (X_k,S_n), where S_n=X_1+\cdots+X_n, are identically distributed for all k.


    In particular, it follows that \forall j,k, ~ E[X_j|S_n]=E[X_k|S_n].

    Hence S_n=E[S_n|S_n]=E\left[\sum_{k=1}^n X_k\bigg|S_n\right]=\sum_{k=1}^n E[X_k|S_n]=n E[X_1|S_n], and dividing by n gives E[X_1|S_n]=\frac{S_n}{n}.
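    As a quick empirical illustration of the identity above (not part of the proof; the exponential distribution, sample sizes, and binning below are all arbitrary choices of mine), one can estimate E[X_1|S_n] by Monte Carlo and compare it with S_n/n:

    ```python
    import numpy as np

    # Sanity check: for i.i.d. X_1,...,X_n, we should have E[X_1 | S_n] = S_n / n.
    # We approximate the conditional expectation by averaging X_1 over samples
    # whose S_n falls into the same quantile bin.
    rng = np.random.default_rng(0)
    n, trials, n_bins = 5, 200_000, 20
    X = rng.exponential(scale=1.0, size=(trials, n))  # i.i.d. integrable r.v.
    S = X.sum(axis=1)

    # Bin S_n by its empirical quantiles so every bin has roughly equal counts.
    edges = np.quantile(S, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.digitize(S, edges) - 1, 0, n_bins - 1)

    cond_mean_X1 = np.array([X[idx == b, 0].mean() for b in range(n_bins)])
    mean_S_over_n = np.array([S[idx == b].mean() / n for b in range(n_bins)])

    # The two columns agree bin by bin, up to Monte Carlo error.
    max_gap = np.abs(cond_mean_X1 - mean_S_over_n).max()
    print(f"max |mean(X_1 | bin) - mean(S_n)/n| = {max_gap:.4f}")
    ```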

  3. #3 mabruka
    Wow, what an interesting approach!

    The only thing that remains unclear to me (maybe I am forgetting a fundamental result) is why applying that function to the Y_k yields that the pairs (X_k, S_n) are identically distributed for all k.



    Is it true that if Z and W are identically distributed random variables, then f(Z) and f(W) are identically distributed
    for any function f:\mathbb R^n \longrightarrow \mathbb R^m? (Does this function need to be continuous, or maybe just measurable?)

    Also, where is the independence hypothesis being used implicitly?
    thank you
    Last edited by mabruka; March 28th 2010 at 05:42 PM.

  4. #4 mabruka
    Oh well, I think this answers my question:

    Z, Y are identically distributed iff E[f(Z)] = E[f(Y)] for every Borel measurable f




    So the only thing that remains to be proved is that the function

    \begin{aligned} f ~:~ \mathbb{R}^n & \to \mathbb{R}^2 \\ (x_1,\dots,x_n) & \mapsto \Big(x_1,\sum_{i=1}^n x_i\Big) \end{aligned}

    is measurable

    right?

    (I am betting it is continuous...)

  5. #5 MHF Contributor (Paris, France)
    Quote Originally Posted by Moo View Post
    This is a very common problem. I'm copying it from something I typed before, so don't worry if the \xi are transformed into X.
    That's a nice post! Actually I wouldn't have bothered going into such detail... I would have just said: by symmetry, (X_1,S_n),\ldots,(X_n,S_n) have the same distribution.

    Or: Let f be a bounded measurable function. We have, by symmetry, E[X_1f(S_n)]=\cdots=E[X_n f(S_n)], hence E[X_1 f(S_n)]=\frac{1}{n}E[(X_1+\cdots+X_n)f(S_n)]=E[\frac{S_n}{n}f(S_n)]. Since \frac{S_n}{n} is a measurable function of S_n, we have checked the definition of the conditional expectation: E[X_1|S_n]=\frac{S_n}{n}.

  6. #6 mabruka
    Quote Originally Posted by Moo View Post
    In particular, it follows that \forall j,k, ~ E[X_j|S_n]=E[X_k|S_n].
    Why does this follow?


    Thank you
    Last edited by mabruka; March 28th 2010 at 09:44 PM.

  7. #7 Moo
    Quote Originally Posted by mabruka View Post
    Wow what an interesting approach there!

    The only thing that remains unclear to me (maybe i am forgetting a fundamental result) is why using that function on Y_k yields that (X_k, S_n) are i.d. for all k?
    Because the Y_k follow the same distribution for all k! So if you apply the function f to each of them, the resulting pairs (X_k,S_n) will all have the same distribution.

    Quote Originally Posted by mabruka View Post
    Is it true that if Z and W are identically distributed r.v., then f(Z) and f(W) are identically distributed
    for any function f:\mathbb R^n \longrightarrow \mathbb R^m? (Does this function need to be continuous, or maybe just measurable?)
    Also, where is the independence hypothesis being used implicitly?

    Quote Originally Posted by mabruka View Post
    Oh well, I think this answers my question:

    Z, Y are identically distributed iff E[f(Z)] = E[f(Y)] for every Borel measurable f
    Not quite; it's rather that for every bounded measurable h, E[h(Z)]=E[h(Y)], and then let h=g\circ f, where g is any bounded measurable function and f is as defined above.

    So the only thing it has yet to be proved is that the function

    \begin{aligned} f ~:~ & \mathbb{R}^n &\to &\mathbb{R}^2 \\  & (x_1,\dots,x_n) & \mapsto & (x_1,\sum_{i=1}^n x_i)  \end{aligned}

    is measurable

    right?
    Well yeah, it has to be done...
    Just recall that the projections (the functions that return the i-th coordinate) are measurable, and so is the sum.

    Quote:
    Why does this follow?
    Read Laurent's answer about this.
    There is an equivalence between knowing the conditional expectation E[X|Z] and knowing the expectation of X multiplied by any (bounded measurable) function of Z.

  8. #8 Moo
    Quote Originally Posted by Laurent View Post
    Actually I wouldn't have bothered going into such details...
    Lol, the story is that on les-maths.net, someone was remarking on the probably small number of teachers who do it the rigorous way.
    So I copied what my own teacher did on that forum, and then here (the second time is always less painful) :P

  9. #9 mabruka
    True! The measurability of f is not a problem now =)

    Quote:
    Why does this follow?
    Read Laurent's answer about this.
    There is an equivalence between knowing the conditional expectation E[X|Z] and knowing the expectation of X multiplied by any (bounded measurable) function of Z.
    What definition of conditional expectation is he using?

    What I know is that if we check that

    E(\frac{S_n}{n} 1_A) = E(X_1 1_A) for every A\in \sigma(S_n),

    then what we need follows: E[X_1|S_n]=\frac{S_n}{n}

    Alternatively, in order to get E[X_k|S_n]=E[X_j|S_n], it suffices to show that

    E(X_k 1_A)=E(X_j 1_A) for every A\in\sigma(S_n), which, to tell you the truth, seems a little bit hard!


    This last step is the one I am not yet convinced of.

    thank you


    Sorry if I am being stubborn, but as you wrote earlier, teachers usually don't give rigorous proofs, and in order to learn something one must know the do's and don'ts.
    Last edited by mabruka; March 29th 2010 at 02:38 PM.

  10. #10 MHF Contributor (Paris, France)
    Quote Originally Posted by mabruka View Post
    What definition of conditional expectation is he using?

    What I know is that if we check that

    E(\frac{S_n}{n} 1_A) = E(X_1 1_A) for every A\in \sigma(S_n),

    then what we need follows: E[X_1|S_n]=\frac{S_n}{n}

    Alternatively, in order to get E[X_k|S_n]=E[X_j|S_n], it suffices to show that

    E(X_k 1_A)=E(X_j 1_A) for every A\in\sigma(S_n), which, to tell you the truth, seems a little bit hard!
    Read my post again, replacing f(S_n) with {\bf 1}_{\{S_n\in B\}}, where B is any measurable subset of \mathbb{R}. Indeed, an event A\in\sigma(S_n) can be written A=\{S_n\in B\} for some measurable subset B of \mathbb{R}.
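    Spelling out that substitution in the notation of the previous posts (a sketch; the function of the pair used below is unbounded, so strictly speaking one also invokes the integrability of the X_k, e.g. via truncation):

    ```latex
    % (X_k, S_n) and (X_j, S_n) have the same distribution, so expectations of
    % any (integrable) function of the pair agree.  Apply this to
    % (x, s) \mapsto x \,\mathbf{1}_{\{s \in B\}} with A = \{S_n \in B\}:
    \[
      E\left[X_k \mathbf{1}_A\right]
        = E\left[X_k \mathbf{1}_{\{S_n \in B\}}\right]
        = E\left[X_j \mathbf{1}_{\{S_n \in B\}}\right]
        = E\left[X_j \mathbf{1}_A\right]
      \qquad \text{for every } A \in \sigma(S_n),
    \]
    % which is exactly the identity that looked hard in post #9.
    ```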

  11. #11 mabruka
    Thank you, it is all clear now!


