# Conditional probability with Poisson?

• December 5th 2009, 04:22 PM
ampersand
Conditional probability with Poisson?
I'm reviewing for a final exam and got stuck on one of the practice problems; I was hoping someone could point me in the right direction.

The question is: given $X = U + V, Y = V + W$, where $U, V$ and $W$ are independent Poissons with different means, find $E[Y|X]$.

So to do this, I figure I'd need to find $f(y|x) = \frac{f(x,y)}{f(x)}$, but I'm having trouble getting $f(x,y)$.

I have: $f(x,y) = P(U+V=x,V+W=y)$, which I can separate using independence (of U and W) to get $P(U+V=x,V+W=y) = P(U = x-v, W = y-v) = P(U=x-v)P(W=y-v)$, but I think that would actually be $f(x,y,v)$, and I'm not sure marginalizing over v to get $f(x,y)$ is the right approach.

Could someone tell me if this is the right direction to head in, or if there's a better/easier way to get $f(y|x)$?

thanks!
• December 5th 2009, 05:48 PM
StatRookie
You are on the right track. You can now multiply the two independent Poisson distributions. Then you must integrate out the variable you don't want, the RV $v$. That will give you $f(x,y)$. From there you shouldn't have a problem.
• December 5th 2009, 06:31 PM
ampersand
Thanks for the reply. How would I go about getting rid of the $v$?

If we let the mean of $U$ be $\lambda_{1}$ and the mean of $W$ be $\lambda_{3}$ (where the mean of $V$ is $\lambda_{2}$),

then, I have:
$\sum_{v=0}^\infty{f(x,y,v)}$ which is: $\sum_{v=0}^\infty{\frac{e^{-\lambda_{1}}\lambda_{1}^{x-v}}{(x-v)!}\frac{e^{-\lambda_{3}}\lambda_{3}^{y-v}}{(y-v)!}}$

and I'm kind of stuck here. I can group together $(\lambda_{1}\lambda_{3})^{-v}$ and try to get the sum to 1 (using total probability of a Poisson with mean $\lambda_{1}\lambda_{3}$ is 1), but the two factorials in the denominator seem to prevent me from getting a single $v!$ on the bottom...
• December 5th 2009, 09:57 PM
matheagle
First of all this is discrete.
You need to proceed in this manner, and drop the X and Y; they are annoying.

$P(V+W=b|U+V=a)={P(V+W=b,U+V=a)\over P(U+V=a)}$

The denominator is easy: it's a Poisson with mean $\lambda_U+\lambda_V$.
The numerator isn't that easy.
--------------------------------------------------------
OR maybe we just attack the expected value.

$E(V+W|U+V) =E(V|U+V)+E(W|U+V) =E(V|U+V)+E(W)$

Since W is independent of U and V, the second one is easy, but the first is...

So derive the distribution of V given U+V. I think that's been done a lot....

$P(V=b|U+V=a)={P(V=b,U+V=a)\over P(U+V=a)}$

$={P(V=b)P(U=a-b)\over P(U+V=a)}$
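That ratio is easy to sanity-check numerically. A short Python sketch (the rates $\lambda_U = 2$, $\lambda_V = 3$ are made up, not from the problem) confirms that for fixed $a$ it is a genuine pmf in $b$:

```python
import math

lam_u, lam_v = 2.0, 3.0   # hypothetical rates, just for the check

def pois(k, lam):
    """Poisson pmf with mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def cond(b, a):
    """P(V = b | U + V = a) = P(V=b) P(U=a-b) / P(U+V=a),
    using that U + V is Poisson(lam_u + lam_v)."""
    return pois(b, lam_v) * pois(a - b, lam_u) / pois(a, lam_u + lam_v)

a = 7
probs = [cond(b, a) for b in range(a + 1)]
print(sum(probs))   # a genuine pmf in b: the probabilities sum to 1
mean = sum(b * p for b, p in zip(range(a + 1), probs))
print(mean)         # compare with a * lam_v / (lam_u + lam_v)
```

The printed mean matches $a\,\lambda_V/(\lambda_U+\lambda_V)$, which is a hint at where the conditional distribution is heading.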
• December 6th 2009, 01:07 AM
Beaky
And then to get the expected value, you would have to sum that probability times b from b = 0 to a? Or is there an easier way to proceed? I can't get that sum to reduce to anything, and I'm pretty sure it does but maybe I'm just wasting my time.

I'm also fairly sure we're in the same class, Ampersand. I feel like I might have some idea what I'm doing if the prof wasn't so terrible.
• December 6th 2009, 08:09 PM
ampersand
Quote:

Originally Posted by matheagle
First of all this is discrete.
You need to proceed in this manner, and drop the X and Y; they are annoying.

$P(V+W=b|U+V=a)={P(V+W=b,U+V=a)\over P(U+V=a)}$

The denominator is easy: it's a Poisson with mean $\lambda_U+\lambda_V$.
The numerator isn't that easy.
--------------------------------------------------------
OR maybe we just attack the expected value.

$E(V+W|U+V) =E(V|U+V)+E(W|U+V) =E(V|U+V)+E(W)$

Since W is independent of U and V, the second one is easy, but the first is...

So derive the distribution of V given U+V. I think that's been done a lot....

$P(V=b|U+V=a)={P(V=b,U+V=a)\over P(U+V=a)}$

$={P(V=b)P(U=a-b)\over P(U+V=a)}$

Thanks a lot! Calculating it directly from expectations was a lot easier, and I think I ended up with $E(Y|X) = X - \lambda_{1} + \lambda_{3}$, which works out quite nicely when you take $E[E(Y|X)] = \lambda_{2} + \lambda_{3}$ as expected.

Quote:

Originally Posted by Beaky
And then to get the expected value, you would have to sum that probability times b from b = 0 to a? Or is there an easier way to proceed? I can't get that sum to reduce to anything, and I'm pretty sure it does but maybe I'm just wasting my time.

I'm also fairly sure we're in the same class, Ampersand. I feel like I might have some idea what I'm doing if the prof wasn't so terrible.

Yep, we're definitely in the same class then =p (your final's on Friday, right?)

I'm not too happy with the teaching either... it seems he makes the stuff appear a lot harder than it should be. I felt the lectures were pretty disorganized, and a bit unclear at times.

have you gone through past exams?
• December 6th 2009, 08:26 PM
matheagle
Well, after grading my exams tonight,
I'm sure my students feel the same way about me.
and you should hit the THANKs button.
• December 7th 2009, 08:24 AM
Beaky
Quote:

Originally Posted by ampersand
Thanks a lot! Calculating it directly from expectations was a lot easier, and I think I ended up with $E(Y|X) = X - \lambda_{1} + \lambda_{3}$, which works out quite nicely when you take $E[E(Y|X)] = \lambda_{2} + \lambda_{3}$ as expected.

Could someone please provide some details as to how to get this, or at least tell me if I'm going about this the right way? The only way I know to solve this would be

$E(V|U+V=a)=\sum_{v=0}^{\infty}v*P(V=v|U+V=a)=\sum_{v=0}^{a}v*P(V=v|U+V=a)$

which I can't seem to reduce.

Quote:

Yep, we're definitely in the same class then =p (your final's on Friday, right?)

I'm not too happy with the teaching either... it seems he makes the stuff appear a lot harder than it should be. I felt the lectures were pretty disorganized, and a bit unclear at times.

have you gone through past exams?
Yeah, final's on Friday. This is the most pressure I've ever felt on an exam, since I'm sure I failed the last test and now need to pass this to pass the class. I've only looked at the old exams available on Blackboard. If you know of others then please share.
• December 8th 2009, 01:27 AM
Beaky
I'm still really stuck on this, so any help would be much appreciated.

So far, I've got

$E(V|U+V=a)=\sum_{b=0}^{\infty}b*P(V=b|U+V=a)=\sum_{b=0}^{a}b*P(V=b|U+V=a)$
$=\sum_{b=0}^{a}b*\frac{P(V=b)P(U=a-b)}{P(U+V=a)}$
$=\sum_{b=0}^{a}b*\frac{\frac{e^{-\lambda_{v}}\lambda_{v}^{b}e^{-\lambda_{u}}\lambda_{u}^{a-b}}{b!(a-b)!}}{\frac{e^{-\lambda_{v}-\lambda_{u}}(\lambda_{v}+\lambda_{u})^{a}}{a!}}$

$=\sum_{b=0}^{a}b*\frac{\lambda_{v}^{b}\lambda_{u}^{a-b}a!}{(\lambda_{v}+\lambda_{u})^{a}b!(a-b)!}$

which looks a lot like a binomial expansion but I'm not sure it's so easily reduced. I also don't even think it's equivalent to what Ampersand got, which makes a lot more sense. I've spent hours checking for some silly mistake and can't find anything.
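For what it's worth, that last sum does collapse. A quick numeric check in Python (the rates are made up) evaluates it term by term and compares it with the binomial mean it should reduce to:

```python
import math

lam_u, lam_v = 2.0, 3.0   # hypothetical rates, just for the check
a = 9

# the sum from the post, evaluated directly
s = sum(b * lam_v**b * lam_u**(a - b) * math.factorial(a)
        / ((lam_v + lam_u)**a * math.factorial(b) * math.factorial(a - b))
        for b in range(a + 1))

# mean of Binomial(a, lam_v / (lam_v + lam_u))
closed_form = a * lam_v / (lam_v + lam_u)
print(s, closed_form)   # the two agree
```

The agreement suggests the summand really is $b$ times a Binomial$(a, \lambda_v/(\lambda_v+\lambda_u))$ pmf.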
• December 8th 2009, 07:59 AM
matheagle
I MADE this comment for a reason...

So derive the distribution of V given U+V. It has been done a lot....

$P(V=b|U+V=a)={P(V=b,U+V=a)\over P(U+V=a)}$

$={P(V=b)P(U=a-b)\over P(U+V=a)}$

--------------------------------------------------

It's easy to prove that if $W\sim P(\lambda_1)$ and $Z\sim P(\lambda_2)$

then $W|W+Z$ is a binomial.

$P(W=a|W+Z=a+b)={{e^{-\lambda_1}\lambda_1^a\over a!}{e^{-\lambda_2}\lambda_2^b\over b!}\over {e^{-\lambda_1-\lambda_2}(\lambda_1+\lambda_2)^{a+b}\over (a+b)!}}$

$={a+b\choose a} \biggl({\lambda_1\over \lambda_1+\lambda_2}\biggr)^a\biggl({\lambda_2\over \lambda_1+\lambda_2}\biggr)^b$

and the mean of a binomial is np.

n=W+Z and $p={\lambda_1\over \lambda_1+\lambda_2}$
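That identity is easy to verify numerically. A short Python sketch ($\lambda_1$, $\lambda_2$ are made up) compares the Poisson ratio with the Binomial$(a+b,\ \lambda_1/(\lambda_1+\lambda_2))$ pmf:

```python
import math

lam1, lam2 = 2.0, 3.0   # hypothetical rates, just for the check
n = 8                    # condition on W + Z = n

def pois(k, lam):
    """Poisson pmf with mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

p = lam1 / (lam1 + lam2)
diffs = [abs(pois(a, lam1) * pois(n - a, lam2) / pois(n, lam1 + lam2)
             - math.comb(n, a) * p**a * (1 - p)**(n - a))
         for a in range(n + 1)]
print(max(diffs))   # the Poisson ratio matches the binomial pmf term by term
```

The maximum discrepancy is at the level of floating-point noise, as the algebra predicts.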
• December 8th 2009, 03:24 PM
Beaky
Alright, thanks. I think I've got it now. I had used what was in your previous post in my last attempt, just in a different approach.

The sum I had was actually correct if I had managed to reduce it, but I assumed Ampersand had the right answer and it wasn't matching up.
• December 8th 2009, 04:27 PM
ampersand
Hmm, maybe I did it wrong then?

I tackled it from expectations, but instead did:

Let's fix $X=x$, where $X=U+V, Y=V+W$, so $Y=x-U+W$, and so $E(Y|X=x)=E(x-U+W)=E(x)-E(U)+E(W)=x-\lambda_{1}+\lambda_{3}$ (since we fix x)

so, if we consider $E(Y|X)$ instead of $E(Y|X=x)$, it becomes: $E(Y|X) = X - \lambda_{1}+\lambda_{3}$

So, did I mis-step at the $E(U)=\lambda_{1}$ when it should have been $E(U|X=x) =$ the binomial expectation (np)?

And also, if we indeed had:
$E(Y|X=x) = x - \lambda_{1}+\lambda_{3}$

would we be allowed to just turn that into:

$E(Y|X) = X - \lambda_{1}+\lambda_{3}$?
• December 8th 2009, 04:55 PM
Beaky
I'm not entirely positive, but I think you're right in that you can't do this:

$E(Y|X=x)=E(x-U+W)$

Because U is still dependent on X.

From Matheagle's post, $E(Y|X)=E(V+W|U+V) =E(V|U+V)+E(W|U+V) =E(V|U+V)+E(W)$

and $P(V=b|U+V=x)$ has a binomial distribution, and so you can work out its mean to be $x*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}$

And so $E(V|U+V=x)+E(W)=x*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}+\lambda_{w}$

Which makes some sense then since you still get

$E[E(V|U+V)]=(\lambda_{v}+\lambda_{u})*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}=\lambda_{v}$

Also, I think $E(Y|X)$ is just a lazy way of writing $E(Y|X=x)$. I would imagine that both ways you wrote out the answer would be acceptable, although it makes more sense to just write $E(Y|X=x)$ to me.
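That tower-property check can also be done numerically. A quick Python sketch (the $\lambda$ values are made up) sums $\varphi(x)P(X=x)$ over $x$ and recovers $\lambda_v + \lambda_w$:

```python
import math

lam_u, lam_v, lam_w = 2.0, 3.0, 1.5   # hypothetical means of U, V, W

def pois(k, lam):
    """Poisson pmf with mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def phi(x):
    """phi(x) = E(Y | X = x) as derived in the thread."""
    return x * lam_v / (lam_v + lam_u) + lam_w

# Tower property: E[phi(X)] should equal E(Y) = lam_v + lam_w.
# X = U + V is Poisson(lam_u + lam_v); truncating at 60 leaves a
# negligible tail when the mean is 5.
total = sum(phi(x) * pois(x, lam_u + lam_v) for x in range(60))
print(total)   # compare with lam_v + lam_w = 4.5
```

So the answer with the binomial mean passes the same $E[E(Y|X)]=E(Y)$ check that the earlier (incorrect) answer happened to pass.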
• December 8th 2009, 05:04 PM
matheagle
Quote:

Originally Posted by Beaky
I'm not entirely positive, but I think you're right in that you can't do this:

$E(Y|X=x)=E(x-U+W)$

Because U is still dependent on X.

From Matheagle's post, $E(Y|X)=E(V+W|U+V) =E(V|U+V)+E(W|U+V) =E(V|U+V)+E(W)$

and $P(V=b|U+V=x)$ has a binomial distribution, and so you can work out its mean to be $x*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}$

And so $E(V|U+V=x)+E(W)=x*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}+\lambda_{w}$

Which makes some sense then since you still get

$E[E(V|U+V)]=(\lambda_{v}+\lambda_{u})*\frac{\lambda_{v}}{\lambda_{v}+\lambda_{u}}=\lambda_{v}$

Also, I think $E(Y|X)$ is just a lazy way of writing $E(Y|X=x)$. I would imagine that both ways you wrote out the answer would be acceptable, although it makes more sense to just write $E(Y|X=x)$ to me.

$E(Y|X)$ is the same as $E(Y|X=x)$.

x is just a realization of X, one of the possibilities.
Capital X is the random variable, x is one of the possible outcomes.
• December 9th 2009, 03:34 AM
Moo
$E[Y|X=x]=\varphi(x)$ (a function of $x$ that you can calculate)

$E[Y|X]=\varphi(X)$ (the same function, evaluated at the random variable $X$)
$E[Y|X]=\varphi(X)$ (function taken at X)