
Conditional probability with Poisson?

  1. #1
Newbie ampersand
    Joined
    Dec 2009
    Posts
    6

Conditional probability with Poisson?

    I'm reviewing for a final exam, and I got stuck on one of the practice problems, and was hoping someone could point me in the right direction:

The question is: given X = U + V and Y = V + W, where U, V, and W are independent Poisson random variables with different means, find E[Y|X].

    So to do this, I figure I'd need to find f(y|x) = \frac{f(x,y)}{f(x)}, but I'm having trouble getting f(x,y).

I have: f(x,y) = P(U+V=x, V+W=y). If I also fix V=v, I can use independence (of U, V, and W) to separate and get P(U+V=x, V+W=y, V=v) = P(U=x-v, V=v, W=y-v) = P(U=x-v)P(V=v)P(W=y-v), but I think that would actually be f(x,y,v), and I'm not sure marginalizing over v to get f(x,y) is the right approach.

    Could someone tell me if this is the right direction to head in, or if there's a better/easier way to get f(y|x)?

    thanks!
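(For anyone who wants to sanity-check a candidate answer numerically: here is a minimal simulation sketch of the setup, assuming numpy and made-up rates 2, 3, 1 for U, V, W.)

import numpy as np

# Made-up rates for U, V, W; any positive values would do
lam_u, lam_v, lam_w = 2.0, 3.0, 1.0

rng = np.random.default_rng(0)
n = 1_000_000
u = rng.poisson(lam_u, n)
v = rng.poisson(lam_v, n)
w = rng.poisson(lam_w, n)
x = u + v  # X = U + V
y = v + w  # Y = V + W

# Empirical E[Y | X = x0] for a few values of x0
for x0 in (3, 5, 8):
    print(x0, y[x == x0].mean())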

  2. #2
    Newbie
    Joined
    Nov 2009
    Posts
    7
You are on the right track. You can now multiply the independent Poisson pmfs. Then you must sum out the variable you don't want, the RV v. That will give you f(x,y). From there you shouldn't have a problem.

  3. #3
Newbie ampersand
    Joined
    Dec 2009
    Posts
    6
Thanks for the reply. How would I go about getting rid of the v?

If we let the mean of U be \lambda_{1}, the mean of V be \lambda_{2}, and the mean of W be \lambda_{3},

then I have:
\sum_{v=0}^{\min(x,y)}{f(x,y,v)} which is: \sum_{v=0}^{\min(x,y)}{\frac{e^{-\lambda_{1}}\lambda_{1}^{x-v}}{(x-v)!}\frac{e^{-\lambda_{2}}\lambda_{2}^{v}}{v!}\frac{e^{-\lambda_{3}}\lambda_{3}^{y-v}}{(y-v)!}}

and I'm kind of stuck here. I can group the v-dependent powers together and try to make the sum come out to 1 (using the fact that a Poisson pmf sums to 1), but the extra factorials in the denominator seem to prevent me from getting a single v! on the bottom...
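(The marginalization itself can at least be checked numerically; a minimal sketch with the same made-up rates \lambda_1=2, \lambda_2=3, \lambda_3=1, confirming that summing f(x,y,v) over v yields a valid joint pmf:)

from math import exp, factorial

lam1, lam2, lam3 = 2.0, 3.0, 1.0  # made-up rates for U, V, W

def pois(k, lam):
    # Poisson pmf, taken to be zero for negative k
    return 0.0 if k < 0 else exp(-lam) * lam ** k / factorial(k)

def f_xy(x, y):
    # joint pmf of (X, Y), summing out V
    return sum(pois(x - v, lam1) * pois(v, lam2) * pois(y - v, lam3)
               for v in range(min(x, y) + 1))

# The joint pmf should sum to (essentially) 1 over a large enough grid
print(sum(f_xy(x, y) for x in range(60) for y in range(60)))  # ~1.0

(The sum over v really is the joint pmf; this is the so-called bivariate Poisson distribution, which has no simple closed form, so the conditional-expectation route suggested below ends up easier.)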

  4. #4
MHF Contributor matheagle
    Joined
    Feb 2009
    Posts
    2,763
    Thanks
    5
First of all, this is discrete.
You need to proceed in this manner, and drop the X, Y; they are annoying.

P(V+W=b|U+V=a)={P(V+W=b,U+V=a)\over P(U+V=a)}

The denominator is easy; it's a Poisson with mean \lambda_U+\lambda_V.
The numerator isn't that easy.
--------------------------------------------------------
OR maybe we just attack the expected value.

E(V+W|U+V)=E(V|U+V)+E(W|U+V)=E(V|U+V)+E(W)

Since W is independent of U and V, the second one is easy, but the first is...

So derive the distribution of V given U+V. I think that's been done a lot....

P(V=b|U+V=a)={P(V=b,U+V=a)\over P(U+V=a)}

={P(V=b)P(U=a-b)\over P(U+V=a)}
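(A simulation sketch of this decomposition, under the same made-up rates as above, estimating both sides of E(V+W|U+V=a)=E(V|U+V=a)+E(W):)

import numpy as np

lam_u, lam_v, lam_w = 2.0, 3.0, 1.0  # made-up rates
rng = np.random.default_rng(1)
n = 1_000_000
u = rng.poisson(lam_u, n)
v = rng.poisson(lam_v, n)
w = rng.poisson(lam_w, n)

for a in (3, 5, 8):
    idx = (u + v) == a
    # E(V+W | U+V=a) versus E(V | U+V=a) + E(W)
    print(a, (v[idx] + w[idx]).mean(), v[idx].mean() + lam_w)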
    Last edited by matheagle; December 5th 2009 at 09:10 PM.

  5. #5
Junior Member Beaky
    Joined
    Dec 2009
    Posts
    26
    And then to get the expected value, you would have to sum that probability times b from b = 0 to a? Or is there an easier way to proceed? I can't get that sum to reduce to anything, and I'm pretty sure it does but maybe I'm just wasting my time.

    I'm also fairly sure we're in the same class, Ampersand. I feel like I might have some idea what I'm doing if the prof wasn't so terrible.

  6. #6
Newbie ampersand
    Joined
    Dec 2009
    Posts
    6
Quote Originally Posted by matheagle
First of all, this is discrete.
You need to proceed in this manner, and drop the X, Y; they are annoying.

P(V+W=b|U+V=a)={P(V+W=b,U+V=a)\over P(U+V=a)}

The denominator is easy; it's a Poisson with mean \lambda_U+\lambda_V.
The numerator isn't that easy.
--------------------------------------------------------
OR maybe we just attack the expected value.

E(V+W|U+V)=E(V|U+V)+E(W|U+V)=E(V|U+V)+E(W)

Since W is independent of U and V, the second one is easy, but the first is...

So derive the distribution of V given U+V. I think that's been done a lot....

P(V=b|U+V=a)={P(V=b,U+V=a)\over P(U+V=a)}

={P(V=b)P(U=a-b)\over P(U+V=a)}
Thanks a lot! Calculating it directly from expectations was a lot easier, and I think I ended up with E(Y|X) = X - \lambda_{1} + \lambda_{3}, which works out quite nicely when you take E[E(Y|X)] = \lambda_{2} + \lambda_{3} as expected.

Quote Originally Posted by Beaky
    And then to get the expected value, you would have to sum that probability times b from b = 0 to a? Or is there an easier way to proceed? I can't get that sum to reduce to anything, and I'm pretty sure it does but maybe I'm just wasting my time.

    I'm also fairly sure we're in the same class, Ampersand. I feel like I might have some idea what I'm doing if the prof wasn't so terrible.
    Yep, we're definitely in the same class then =p (your final's on Friday, right?)

I'm not too happy with the teaching either... it seems he makes the stuff appear a lot harder than it should be. I felt the lectures were pretty disorganized, and a bit unclear at times.

Have you gone through past exams?

  7. #7
MHF Contributor matheagle
    Joined
    Feb 2009
    Posts
    2,763
    Thanks
    5
Well, after grading my exams tonight, I'm sure my students feel the same way about me.

And you should hit the THANKS button.
    Last edited by matheagle; December 6th 2009 at 07:37 PM.

  8. #8
Junior Member Beaky
    Joined
    Dec 2009
    Posts
    26
Quote Originally Posted by ampersand
Thanks a lot! Calculating it directly from expectations was a lot easier, and I think I ended up with E(Y|X) = X - \lambda_{1} + \lambda_{3}, which works out quite nicely when you take E[E(Y|X)] = \lambda_{2} + \lambda_{3} as expected.
    Could someone please provide some details as to how to get this, or at least tell me if I'm going about this the right way? The only way I know to solve this would be

E(V|U+V=a)=\sum_{v=0}^{\infty}v*P(V=v|U+V=a)=\sum_{v=0}^{a}v*P(V=v|U+V=a)

    which I can't seem to reduce.

    Yep, we're definitely in the same class then =p (your final's on Friday, right?)

I'm not too happy with the teaching either... it seems he makes the stuff appear a lot harder than it should be. I felt the lectures were pretty disorganized, and a bit unclear at times.

Have you gone through past exams?
Yeah, final's on Friday. This is the most pressure I've ever felt on an exam, since I'm sure I failed the last test and now need to pass this to pass the class. I've only looked at the old exams available on Blackboard. If you know of others, then please share.

  9. #9
Junior Member Beaky
    Joined
    Dec 2009
    Posts
    26
    I'm still really stuck on this, so any help would be much appreciated.

    So far, I've got

E(V|U+V=a)=\sum_{b=0}^{\infty}b*P(V=b|U+V=a)=\sum_{b=0}^{a}b*P(V=b|U+V=a)
=\sum_{b=0}^{a}b*\frac{P(V=b)P(U=a-b)}{P(U+V=a)}
=\sum_{b=0}^{a}b*\frac{\frac{e^{-\lambda_{v}}\lambda_{v}^{b}e^{-\lambda_{u}}\lambda_{u}^{a-b}}{b!(a-b)!}}{\frac{e^{-\lambda_{v}-\lambda_{u}}(\lambda_{v}+\lambda_{u})^{a}}{a!}}
=\sum_{b=0}^{a}b*\frac{\lambda_{v}^{b}\lambda_{u}^{a-b}a!}{(\lambda_{v}+\lambda_{u})^{a}b!(a-b)!}

    which looks a lot like a binomial expansion but I'm not sure it's so easily reduced. I also don't even think it's equivalent to what Ampersand got, which makes a lot more sense. I've spent hours checking for some silly mistake and can't find anything.

  10. #10
MHF Contributor matheagle
    Joined
    Feb 2009
    Posts
    2,763
    Thanks
    5
I MADE this comment for a reason...

    So derive the distribution of V given U+V. It has been done a lot....

P(V=b|U+V=a)={P(V=b,U+V=a)\over P(U+V=a)}

={P(V=b)P(U=a-b)\over P(U+V=a)}

    --------------------------------------------------

    It's easy to prove that if W\sim P(\lambda_1) and Z\sim P(\lambda_2)

    then W|W+Z is a binomial.

P(W=a|W+Z=a+b)={{e^{-\lambda_1}\lambda_1^a\over a!}{e^{-\lambda_2}\lambda_2^b\over b!}\over {e^{-\lambda_1-\lambda_2}(\lambda_1+\lambda_2)^{a+b}\over (a+b)!}}

={a+b\choose a}\biggl({\lambda_1\over \lambda_1+\lambda_2}\biggr)^a\biggl({\lambda_2\over \lambda_1+\lambda_2}\biggr)^b

    and the mean of a binomial is np.

    n=W+Z and p={\lambda_1\over \lambda_1+\lambda_2}
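(Connecting this back to the sum in post #9: with p={\lambda_v\over \lambda_u+\lambda_v}, that sum is exactly the mean of a binomial,

\sum_{b=0}^{a} b{a\choose b}\biggl({\lambda_v\over \lambda_u+\lambda_v}\biggr)^b\biggl({\lambda_u\over \lambda_u+\lambda_v}\biggr)^{a-b}=a\,{\lambda_v\over \lambda_u+\lambda_v}

so E(V|U+V=a)=a\lambda_v/(\lambda_u+\lambda_v).)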
    Last edited by matheagle; December 8th 2009 at 03:44 PM. Reason: had my a's and b's backwards

  11. #11
Junior Member Beaky
    Joined
    Dec 2009
    Posts
    26
Alright, thanks. I think I've got it now. I had used what was in your previous post in my last attempt, just with a different approach.

The sum I had was actually correct, I just hadn't managed to reduce it; I had assumed Ampersand had the right answer, and it wasn't matching up.

  12. #12
Newbie ampersand
    Joined
    Dec 2009
    Posts
    6
    Hmm, maybe I did it wrong then?

    I tackled it from expectations, but instead did:

    Let's fix X=x, where X=U+V, Y=V+W, so Y=x-U+W, and so E(Y|X=x)=E(x-U+W)=E(x)-E(U)+E(W)=x-\lambda_{1}+\lambda_{3} (since we fix x)

So, if we consider E(Y|X) instead of E(Y|X=x), it becomes: E(Y|X) = X - \lambda_{1}+\lambda_{3}

So, did I mis-step at E(U)=\lambda_{1}, when it should have been E(U|X=x) = the binomial expectation (np)?

    And also, if we indeed had:
    E(Y|X=x) = x - \lambda_{1}+\lambda_{3}

    would we be allowed to just turn that into:

    E(Y|X) = X - \lambda_{1}+\lambda_{3}?
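(A quick numerical look at exactly that step, with made-up rates: conditioning on X does change the mean of U.)

import numpy as np

lam_u, lam_v = 2.0, 3.0  # made-up rates
rng = np.random.default_rng(2)
n = 1_000_000
u = rng.poisson(lam_u, n)
v = rng.poisson(lam_v, n)

for x0 in (3, 5, 8):
    idx = (u + v) == x0
    # Empirical E(U | X = x0) versus the binomial mean x0*lam_u/(lam_u+lam_v),
    # versus the unconditional mean lam_u
    print(x0, u[idx].mean(), x0 * lam_u / (lam_u + lam_v), lam_u)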

  13. #13
Junior Member Beaky
    Joined
    Dec 2009
    Posts
    26
    I'm not entirely positive, but I think you're right in that you can't do this:

    E(Y|X=x)=E(x-U+W)

    Because U is still dependent on X.

From Matheagle's post, E(Y|X)=E(V+W|U+V)=E(V|U+V)+E(W|U+V)=E(V|U+V)+E(W),

and V given U+V=x has a binomial distribution, so you can work out its mean to be x*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}.

And so E(V|U+V=x)+E(W)=x*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}+\lambda_{w}

Which makes some sense then, since you still get

E[E(V|U+V)]=(\lambda_{v}+\lambda_{u})*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}=\lambda_{v}

    Also, I think E(Y|X) is just a lazy way of writing E(Y|X=x). I would imagine that both ways you wrote out the answer would be acceptable, although it makes more sense to just write E(Y|X=x) to me.
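(Checking this final formula by simulation, with made-up rates, the empirical conditional mean does match x_0*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}+\lambda_{w}:)

import numpy as np

lam_u, lam_v, lam_w = 2.0, 3.0, 1.0  # made-up rates
rng = np.random.default_rng(3)
n = 1_000_000
u = rng.poisson(lam_u, n)
v = rng.poisson(lam_v, n)
w = rng.poisson(lam_w, n)
x, y = u + v, v + w

for x0 in (3, 5, 8):
    # Empirical E(Y | X = x0) versus the closed form derived above
    print(x0, y[x == x0].mean(), x0 * lam_v / (lam_v + lam_u) + lam_w)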

  14. #14
MHF Contributor matheagle
    Joined
    Feb 2009
    Posts
    2,763
    Thanks
    5
Quote Originally Posted by Beaky
    I'm not entirely positive, but I think you're right in that you can't do this:

    E(Y|X=x)=E(x-U+W)

    Because U is still dependent on X.

From Matheagle's post, E(Y|X)=E(V+W|U+V)=E(V|U+V)+E(W|U+V)=E(V|U+V)+E(W),

and V given U+V=x has a binomial distribution, so you can work out its mean to be x*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}.

And so E(V|U+V=x)+E(W)=x*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}+\lambda_{w}

Which makes some sense then, since you still get

E[E(V|U+V)]=(\lambda_{v}+\lambda_{u})*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}=\lambda_{v}

    Also, I think E(Y|X) is just a lazy way of writing E(Y|X=x). I would imagine that both ways you wrote out the answer would be acceptable, although it makes more sense to just write E(Y|X=x) to me.

    E(Y|X) is the same as E(Y|X=x).

x is just a realization of X, one of the possibilities.
    Capital X is the random variable, x is one of the possible outcomes.

  15. #15
Moo
A Cute Angle
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6
E[Y|X=x]=\varphi(x) (a function that you can calculate)

E[Y|X]=\varphi(X) (the same function, evaluated at the random variable X)
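(For this problem, using the result from post #13, that function is

\varphi(x)=x\,{\lambda_v\over \lambda_u+\lambda_v}+\lambda_w \qquad\Rightarrow\qquad E[Y|X]=\varphi(X)=X\,{\lambda_v\over \lambda_u+\lambda_v}+\lambda_w.)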
