# Thread: Conditional probability with poisson?

1. ## Conditional probability with poisson?

I'm reviewing for a final exam, and I got stuck on one of the practice problems, and was hoping someone could point me in the right direction:

The question is: given $X = U + V$, $Y = V + W$, where $U, V$, and $W$ are independent Poissons with different means, find $E[Y|X]$.

So to do this, I figure I'd need to find $f(y|x) = \frac{f(x,y)}{f(x)}$, but I'm having trouble getting $f(x,y)$.

I have $f(x,y) = P(U+V=x,\,V+W=y)$. Conditioning on $V=v$ and using independence (of $U$, $V$, and $W$), I can separate and get $P(U+V=x,\,V+W=y,\,V=v) = P(U=x-v)P(V=v)P(W=y-v)$, but I think that would actually be $f(x,y,v)$, and I'm not sure marginalizing over $v$ to get $f(x,y)$ is the right approach.

Could someone tell me if this is the right direction to head in, or if there's a better/easier way to get $f(y|x)$?

thanks!

2. You are on the right track. You can now multiply the independent Poisson pmfs. Then you must sum out the variable you don't want, the RV $V$. That will give you $f(x,y)$. From there you shouldn't have a problem.

3. Thanks for the reply. How would I go about getting rid of the $v$?

If we let the mean of U be $\lambda_{1}$ and mean of W be $\lambda_{3}$ (where mean of V is $\lambda_{2}$)

then, I have:
$\sum_{v=0}^\infty f(x,y,v)$, which is: $\sum_{v=0}^\infty \frac{e^{-\lambda_{1}}\lambda_{1}^{x-v}}{(x-v)!}\,\frac{e^{-\lambda_{2}}\lambda_{2}^{v}}{v!}\,\frac{e^{-\lambda_{3}}\lambda_{3}^{y-v}}{(y-v)!}$

and I'm kind of stuck here. I can group together the terms with $v$ in the exponent and try to get the sum to equal 1 (using the fact that a Poisson pmf sums to 1), but the factorials in the denominator seem to prevent me from getting a single $v!$ on the bottom...

4. First of all, this is discrete.
You need to proceed in this manner, and drop the $X, Y$; they are annoying.

$P(V+W=b|U+V=a)={P(V+W=b,U+V=a)\over P(U+V=a)}$

The denominator is easy, it's a Poisson with mean $\lambda_U+\lambda_V$
The numerator isn't that easy.
--------------------------------------------------------
OR maybe we just attack the expected value.

$E(V+W|U+V) =E(V|U+V)+E(W|U+V) =E(V|U+V)+E(W)$

Since $W$ is independent of $U,V$, the second term is easy, but the first is...

So derive the distribution of V given U+V. I think that's been done a lot....

$P(V=b|U+V=a)={P(V=b,U+V=a)\over P(U+V=a)}$

$={P(V=b)P(U=a-b)\over P(U+V=a)}$
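
A quick numerical sanity check of that last ratio (a Python sketch; the rates $\lambda_U=2$, $\lambda_V=3$ and the value $a=4$ are made up for the check, and the helper names are mine):

```python
import math
import random

# Made-up rates; any positive values should work
lam_u, lam_v = 2.0, 3.0
a = 4  # condition on U + V = a

def pois_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def pois_sample(lam):
    # Knuth's method: count how many uniforms it takes for the
    # running product to drop below e^(-lam)
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# The ratio from the post: P(V=b)P(U=a-b) / P(U+V=a),
# where U+V ~ Poisson(lam_u + lam_v)
denom = pois_pmf(a, lam_u + lam_v)
formula = [pois_pmf(b, lam_v) * pois_pmf(a - b, lam_u) / denom
           for b in range(a + 1)]

# Monte Carlo estimate of the same conditional pmf
random.seed(1)
counts, hits = [0] * (a + 1), 0
for _ in range(200_000):
    u, v = pois_sample(lam_u), pois_sample(lam_v)
    if u + v == a:        # keep only draws with U + V = a
        counts[v] += 1
        hits += 1
estimate = [c / hits for c in counts]

print([round(f, 3) for f in formula])
print([round(e, 3) for e in estimate])
```

The simulated frequencies should agree with the formula to within Monte Carlo noise.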

5. And then to get the expected value, you would have to sum that probability times b from b = 0 to a? Or is there an easier way to proceed? I can't get that sum to reduce to anything, and I'm pretty sure it does but maybe I'm just wasting my time.

I'm also fairly sure we're in the same class, Ampersand. I feel like I might have some idea what I'm doing if the prof wasn't so terrible.

6. Originally Posted by matheagle
First of all, this is discrete.
You need to proceed in this manner, and drop the $X, Y$; they are annoying.

$P(V+W=b|U+V=a)={P(V+W=b,U+V=a)\over P(U+V=a)}$

The denominator is easy, it's a Poisson with mean $\lambda_U+\lambda_V$
The numerator isn't that easy.
--------------------------------------------------------
OR maybe we just attack the expected value.

$E(V+W|U+V) =E(V|U+V)+E(W|U+V) =E(V|U+V)+E(W)$

Since $W$ is independent of $U,V$, the second term is easy, but the first is...

So derive the distribution of V given U+V. I think that's been done a lot....

$P(V=b|U+V=a)={P(V=b,U+V=a)\over P(U+V=a)}$

$={P(V=b)P(U=a-b)\over P(U+V=a)}$
Thanks a lot! Calculating it directly from expectations was a lot easier, and I think I ended up with $E(Y|X) = X - \lambda_{1} + \lambda_{3}$, which works out quite nicely when you take $E[E(Y|X)] = \lambda_{2} + \lambda_{3}$ as expected

Originally Posted by Beaky
And then to get the expected value, you would have to sum that probability times b from b = 0 to a? Or is there an easier way to proceed? I can't get that sum to reduce to anything, and I'm pretty sure it does but maybe I'm just wasting my time.

I'm also fairly sure we're in the same class, Ampersand. I feel like I might have some idea what I'm doing if the prof wasn't so terrible.
Yep, we're definitely in the same class then =p (your final's on Friday, right?)

I'm not too happy with the teaching either... it seems he makes the stuff appear a lot harder than it should be. I felt the lectures were pretty disorganized, and a bit unclear at times.

have you gone through past exams?

7. Well, after grading my exams tonight,
I'm sure my students feel the same way about me.
and you should hit the THANKS button.

8. Originally Posted by ampersand
Thanks a lot! Calculating it directly from expectations was a lot easier, and I think I ended up with $E(Y|X) = X - \lambda_{1} + \lambda_{3}$, which works out quite nicely when you take $E[E(Y|X)] = \lambda_{2} + \lambda_{3}$ as expected
Could someone please provide some details as to how to get this, or at least tell me if I'm going about this the right way? The only way I know to solve this would be

$E(V|U+V=a)=\sum_{v=0}^{\infty}v*P(V=v|U+V=a)=\sum_{v=0}^{a}v*P(V=v|U+V=a)$

which I can't seem to reduce.

Yep, we're definitely in the same class then =p (your final's on Friday, right?)

I'm not too happy with the teaching either... it seems he makes the stuff appear a lot harder than it should be. I felt the lectures were pretty disorganized, and a bit unclear at times.

have you gone through past exams?
Yeah, final's on Friday. This is the most pressure I've ever felt on an exam, since I'm sure I failed the last test and now need to pass this one to pass the class. I've only looked at the old exams available on Blackboard. If you know of others, then please share.

9. I'm still really stuck on this, so any help would be much appreciated.

So far, I've got

$E(V|U+V=a)=\sum_{b=0}^{\infty}b*P(V=b|U+V=a)=\sum_{b=0}^{a}b*P(V=b|U+V=a)$
$=\sum_{b=0}^{a}b*\frac{P(V=b)(P(U=a-b))}{P(U+V=a)}$
$=\sum_{b=0}^{a}b*\frac{\frac{e^{-\lambda_{v}}\lambda_{v}^{b}e^{-\lambda_{u}}\lambda_{u}^{a-b}}{b!(a-b)!}}{\frac{e^{-\lambda_{v}-\lambda_{u}}(\lambda_{v}+\lambda_{u})^{a}}{a!}}$

$=\sum_{b=0}^{a}b*\frac{\lambda_{v}^{b}\lambda_{u}^{a-b}a!}{(\lambda_{v}+\lambda_{u})^{a}b!(a-b)!}$

which looks a lot like a binomial expansion, but I'm not sure it's so easily reduced. I also don't think it's equivalent to what Ampersand got, which would make a lot more sense. I've spent hours checking for some silly mistake and can't find anything.

10. I MADE this comment for a reason............

So derive the distribution of V given U+V. It has been done a lot....

$P(V=b|U+V=a)={P(V=b,U+V=a)\over P(U+V=a)}$

$={P(V=b)P(U=a-b)\over P(U+V=a)}$

--------------------------------------------------

It's easy to prove that if $W\sim P(\lambda_1)$ and $Z\sim P(\lambda_2)$

then $W|W+Z$ is a binomial.

$P(W=a|W+Z=a+b)={{e^{-\lambda_1}\lambda_1^a\over a!}{e^{-\lambda_2}\lambda_2^b\over b!}\over {e^{-\lambda_1-\lambda_2}(\lambda_1+\lambda_2)^{a+b}\over (a+b)!}}$

$={a+b\choose a} \biggl({\lambda_1\over \lambda_1+\lambda_2}\biggr)^a\biggl({\lambda_2\over \lambda_1+\lambda_2}\biggr)^b$

and the mean of a binomial is np.

$n=W+Z$ and $p={\lambda_1\over \lambda_1+\lambda_2}$
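
If anyone wants to verify this numerically, here is a quick sketch (Python; the values $\lambda_1=1.5$, $\lambda_2=4.5$, $n=6$ are made up for the check, and the helper names are mine):

```python
import math

# Made-up rates; p = lambda_1 / (lambda_1 + lambda_2) as above
lam1, lam2 = 1.5, 4.5
p = lam1 / (lam1 + lam2)
n = 6  # condition on W + Z = n

def pois_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Conditional pmf from the ratio of Poisson pmfs: P(W=a | W+Z=n)
cond = [pois_pmf(a, lam1) * pois_pmf(n - a, lam2) / pois_pmf(n, lam1 + lam2)
        for a in range(n + 1)]

# Binomial(n, p) pmf, which the algebra says this should equal exactly
binom = [math.comb(n, a) * p ** a * (1 - p) ** (n - a) for a in range(n + 1)]

# Conditional mean, which should come out to n*p
mean = sum(a * q for a, q in zip(range(n + 1), cond))

print(max(abs(c - b) for c, b in zip(cond, binom)))  # ~0 (float noise)
print(mean, n * p)  # both approx 1.5 here
```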

11. Alright, thanks. I think I've got it now. I had used what was in your previous post in my last attempt, just in a different approach.

The sum I had was actually correct if I had managed to reduce it, but I assumed Ampersand had the right answer and it wasn't matching up.
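
For completeness, the sum from my earlier post does reduce. Writing $p={\lambda_{v}\over \lambda_{u}+\lambda_{v}}$ and $q=1-p$, it is $\sum_{b=0}^{a}b{a\choose b}p^{b}q^{a-b}$, and the identity $b{a\choose b}=a{a-1\choose b-1}$ gives

$\sum_{b=0}^{a}b{a\choose b}p^{b}q^{a-b}=ap\sum_{b=1}^{a}{a-1\choose b-1}p^{b-1}q^{(a-1)-(b-1)}=ap(p+q)^{a-1}=ap=a\,{\lambda_{v}\over \lambda_{u}+\lambda_{v}}$

which matches the binomial mean $np$.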

12. Hmm, maybe I did it wrong then?

I tackled it from expectations, but instead did:

Let's fix $X=x$, where $X=U+V, Y=V+W$, so $Y=x-U+W$, and so $E(Y|X=x)=E(x-U+W)=E(x)-E(U)+E(W)=x-\lambda_{1}+\lambda_{3}$ (since we fix x)

so, if we consider $E(Y|X)$ instead of $E(Y|X=x)$, it becomes: $E(Y|X) = X - \lambda_{1}+\lambda_{3}$

So, did I mis-step at the $E(U)=\lambda_{1}$ when it should have been $E(U|X=x) =$ the binomial expectation ($np$)?

And also, if we indeed had:
$E(Y|X=x) = x - \lambda_{1}+\lambda_{3}$

would we be allowed to just turn that into:

$E(Y|X) = X - \lambda_{1}+\lambda_{3}$?

13. I'm not entirely positive, but I think you're right in that you can't do this:

$E(Y|X=x)=E(x-U+W)$

Because U is still dependent on X.

From Matheagle's post, $E(Y|X)=E(V+W|U+V) =E(V|U+V)+E(W|U+V) =E(V|U+V)+E(W)$

and $P(V=b|U+V=x)$ has a binomial distribution, and so you can work out its mean to be $x*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}$

And so $E(V|U+V=x)+E(W)=x*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}+\lambda_{w}$

Which makes some sense then since you still get

$E[E(V|U+V)]=(\lambda_{v}+\lambda_{u})*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}=\lambda_{v}$

Also, I think $E(Y|X)$ is just a lazy way of writing $E(Y|X=x)$. I would imagine that both ways you wrote out the answer would be acceptable, although it makes more sense to just write $E(Y|X=x)$ to me.
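
To see which of the two answers holds up, a quick simulation sketch (Python; the rates and the value $x=7$ are made up for the check, not from the problem):

```python
import math
import random

# Made-up rates standing in for lambda_u, lambda_v, lambda_w
lam_u, lam_v, lam_w = 2.0, 3.0, 1.0
x = 7  # condition on X = U + V = x

def pois_sample(lam):
    # Knuth's method for a Poisson draw
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

random.seed(2)
total, hits = 0.0, 0
for _ in range(300_000):
    u, v, w = pois_sample(lam_u), pois_sample(lam_v), pois_sample(lam_w)
    if u + v == x:       # keep only draws with X = x
        total += v + w   # Y = V + W
        hits += 1

mc = total / hits
binomial_answer = x * lam_v / (lam_u + lam_v) + lam_w  # x*p + E(W) = 5.2
naive_answer = x - lam_u + lam_w                       # the x - lambda_1 + lambda_3 guess = 6.0
print(round(mc, 2), binomial_answer, naive_answer)
```

The Monte Carlo average should land on the binomial-mean answer, not the naive one.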

14. Originally Posted by Beaky
I'm not entirely positive, but I think you're right in that you can't do this:

$E(Y|X=x)=E(x-U+W)$

Because U is still dependent on X.

From Matheagle's post, $E(Y|X)=E(V+W|U+V) =E(V|U+V)+E(W|U+V) =E(V|U+V)+E(W)$

and $P(V=b|U+V=x)$ has a binomial distribution, and so you can work out its mean to be $x*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}$

And so $E(V|U+V=x)+E(W)=x*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}+\lambda_{w}$

Which makes some sense then since you still get

$E[E(V|U+V)]=(\lambda_{v}+\lambda_{u})*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}=\lambda_{v}$

Also, I think $E(Y|X)$ is just a lazy way of writing $E(Y|X=x)$. I would imagine that both ways you wrote out the answer would be acceptable, although it makes more sense to just write $E(Y|X=x)$ to me.

$E(Y|X)$ is the same as $E(Y|X=x)$.

x is just a realization of X, one of the possibilities.
Capital X is the random variable, x is one of the possible outcomes.

15. $E[Y|X=x]=\varphi(x)$ (function that you can calculate)

$E[Y|X]=\varphi(X)$ (function taken at X)
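
A small sketch of that distinction (Python; the rates are made up): $\varphi$ is an ordinary function you can evaluate, and $E[Y|X]=\varphi(X)$ is that function applied to the random $X$, so averaging $\varphi(X)$ over draws of $X$ should recover $E[Y]=\lambda_{V}+\lambda_{W}$ by the tower property.

```python
import math
import random

# Made-up rates for U, V, W
lam_u, lam_v, lam_w = 2.0, 3.0, 1.0

def phi(x):
    # E[Y | X = x] as derived in the thread: x * p + E(W)
    return x * lam_v / (lam_u + lam_v) + lam_w

def pois_sample(lam):
    # Knuth's method for a Poisson draw
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# E[Y|X] is the random variable phi(X): average it over draws of X = U + V
random.seed(3)
n = 200_000
avg = sum(phi(pois_sample(lam_u) + pois_sample(lam_v)) for _ in range(n)) / n
print(round(avg, 2), lam_v + lam_w)  # tower property: E[phi(X)] = E[Y]
```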