# Thread: Questions on conditional and unconditional means and variances

1. ## Questions on conditional and unconditional means and variances

I have a couple of questions in which I need to find the unconditional means and unconditional variances given the conditional probability distribution.

1. Suppose the random variable Y is given by P(Y=y|X=x) = (1-x)x^y; y = 0, 1, 2, ...; 0 < x < 1

I need to show that conditionally Var(Y|X=x)= E(Y|X=x)[1+E(Y|X=x)] and unconditionally Var(Y)=E(Y)[1+E(Y)]+2*Var(X/(1-X))

2. Let Y have the conditional binomial distribution P(Y=y|X=x)=(xCy)p^y(1-p)^(x-y); y=0,1,...,x; 0<p<1
If X has the Poisson distribution P(X=x)=w^x * exp(-w) / x!; x=0,1,..,infinity

Once again, I need to find the unconditional mean and variance of Y and the unconditional distribution of Y.
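For question 2, the standard Poisson-thinning result says Y should be unconditionally Poisson with mean w*p (so mean and variance both equal w*p). A quick simulation sketch, using the arbitrary illustration values w = 3 and p = 0.4, suggests what the algebra should produce:

```python
import numpy as np

# Numerical sketch for question 2: if X ~ Poisson(w) and Y|X=x ~ Binomial(x, p),
# the Poisson-thinning result gives Y ~ Poisson(w*p), so the unconditional
# mean and variance of Y should both be w*p.  The values w = 3, p = 0.4 are
# arbitrary choices for illustration.
rng = np.random.default_rng(1)
n = 1_000_000
w, p = 3.0, 0.4

x = rng.poisson(w, size=n)   # sample X ~ Poisson(w)
y = rng.binomial(x, p)       # one Binomial(x, p) draw per sampled x

print(y.mean(), y.var())     # both should be close to w*p = 1.2
```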

I more or less just need to know where to start. Any help will be greatly appreciated. Thanks in advance.

2. Those questions look suspiciously familiar. Does 'MAB314' ring any bells? Sorry I can't help you with them though; I was going to post them as well.

3. lol I have no idea what you're talking about

I have managed to work them out though.

4. I was given those exact questions very recently (in fact, the assignment was due on the day you posted) as part of my course on statistical modelling; MAB314 is the course code. They were questions 2 and 4. My lecturer must be lazy and copying questions from somewhere (or your lecturer is the lazy one, or both).

5. Originally Posted by Nathan_84
I have a couple of questions in which I need to find the unconditional means and unconditional variances given the conditional probability distribution.

1. Suppose the random variable Y is given by P(Y=y|X=x)=(1-x)x^y. y=0,1,2,..,infinity; 0<x<1

I need to show that conditionally Var(Y|X=x)= E(Y|X=x)[1+E(Y|X=x)] and unconditionally Var(Y)=E(Y)[1+E(Y)]+2*Var(X/(1-X))

$
Var(Y|X=x)=E\left[(Y-E(Y|X=x))^2|X=x\right]=E(Y^2|X=x)-\left[E(Y|X=x)\right]^2
$

So now we need to know how to find the conditional first and second moments. This is a special case of the conditional expectation:

$
E(f(y)|x)=\int f(y) p(y|x) dy
$

or if Y is discrete:

$
E(f(y)|x)=\sum_{i} f(y_i) p(y_i|x)
$
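As a quick numerical sanity check of this discrete formula, the conditional moments of Y given X = x can be computed by truncating the sum; the value x = 0.3 below is an arbitrary choice, and the results should match the closed forms x/(1-x) and x/(1-x)^2:

```python
# Truncated-sum check of the conditional moments for P(Y=y|X=x) = (1-x) x^y,
# using the arbitrary value x = 0.3.  Terms beyond y = 200 are negligible.
x = 0.3
ys = range(200)
probs = [(1 - x) * x**y for y in ys]           # P(Y=y | X=x)

ey  = sum(y * py for y, py in zip(ys, probs))          # E(Y|X=x)
ey2 = sum(y * y * py for y, py in zip(ys, probs))      # E(Y^2|X=x)
var = ey2 - ey**2                                      # Var(Y|X=x)

print(ey, var)   # compare with x/(1-x) and x/(1-x)^2
```

Note that var also agrees with ey*(1 + ey), which is the identity the question asks for.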

To complete this we need to know that for $x \in (0,1)$:

$
\sum_{k=0}^{\infty}k x^k=\frac{x}{(1-x)^2}
$

and:

$
\sum_{k=0}^{\infty}k^2 x^k=\frac{x(x+1)}{(1-x)^3}
$
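Carrying the algebra through with these sums:

$
E(Y|X=x)=(1-x)\sum_{y=0}^{\infty}y x^y=(1-x)\frac{x}{(1-x)^2}=\frac{x}{1-x}
$

$
E(Y^2|X=x)=(1-x)\sum_{y=0}^{\infty}y^2 x^y=(1-x)\frac{x(x+1)}{(1-x)^3}=\frac{x(x+1)}{(1-x)^2}
$

so:

$
Var(Y|X=x)=\frac{x(x+1)}{(1-x)^2}-\frac{x^2}{(1-x)^2}=\frac{x}{(1-x)^2}=\frac{x}{1-x}\left(1+\frac{x}{1-x}\right)
$

which is exactly $E(Y|X=x)\left[1+E(Y|X=x)\right]$, as required.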

RonL
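The unconditional identity in question 1, Var(Y) = E(Y)[1 + E(Y)] + 2 Var(X/(1-X)), can also be checked numerically. The problem leaves the distribution of X unspecified, so the sketch below assumes X ~ Beta(2, 5) purely for illustration (chosen so that the moments of X/(1-X) exist); conditionally on X = x, Y is the number of failures before the first success in trials with success probability 1 - x:

```python
import numpy as np

# Monte Carlo check of Var(Y) = E(Y)[1 + E(Y)] + 2*Var(X/(1-X)).
# X ~ Beta(2, 5) is an assumed, arbitrary mixing distribution for X,
# since the problem does not specify one.
rng = np.random.default_rng(0)
n = 1_000_000

x = rng.beta(2, 5, size=n)
# Given X = x, P(Y=y|X=x) = (1-x) x^y, i.e. a geometric count of
# failures with success probability 1 - x.
y = rng.geometric(1 - x) - 1

m = x / (1 - x)                                  # E(Y|X=x)
lhs = y.var()
rhs = y.mean() * (1 + y.mean()) + 2 * m.var()
print(lhs, rhs)                                  # the two sides should agree closely
```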