1. ## Conditional probability

I'm wondering about the following question:

Let the random variables $N, X_1, X_2, \dots$ be independent, with $N \in Po(\lambda)$ and $X_k \in Be(1/2)$, $k \ge 1$. Set $Y_1 = \sum_{k=1}^N X_k$ and $Y_2 = N - Y_1$, with $Y_1 = 0$ for $N = 0$. Show that $Y_1$ and $Y_2$ are independent and determine their distributions.

My thoughts so far: I see that $Y_1$ given $N = n$ is binomially distributed with parameters $n$ and $1/2$. What I want to show is that $p_{Y_1,Y_2}(i,j) = p_{Y_1}(i) \cdot p_{Y_2}(j)$, and one can expand

$$p_{Y_1,Y_2}(i,j) = \Pr(Y_1 = i, Y_2 = j) = \sum_{n=0}^{\infty} \Pr(Y_1 = i, Y_2 = j \mid N = n) \cdot \Pr(N = n)$$

according to the law of total probability, but I'm kind of stuck after that.
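One way the expansion can be finished (a sketch of the standard Poisson-thinning computation): since $Y_1 + Y_2 = N$, only the term $n = i + j$ in the sum is nonzero, so

$$
\begin{aligned}
\Pr(Y_1 = i, Y_2 = j)
  &= \Pr(Y_1 = i \mid N = i+j)\,\Pr(N = i+j) \\
  &= \binom{i+j}{i} \left(\tfrac12\right)^{i+j} \cdot e^{-\lambda}\,\frac{\lambda^{i+j}}{(i+j)!} \\
  &= e^{-\lambda/2}\,\frac{(\lambda/2)^i}{i!} \cdot e^{-\lambda/2}\,\frac{(\lambda/2)^j}{j!},
\end{aligned}
$$

which factors as a product of two $Po(\lambda/2)$ probability masses for every $(i,j)$, giving both the independence and the marginal distributions at once.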

2. ## Re: Conditional probability

Hey MagisterMan.

After looking at this problem, the only reason these are considered independent is that they happen to satisfy the criterion (i.e. P(A and B) = P(A)P(B)). Normally, in a situation like this, Y2 would not be independent of Y1, since Y2 is a function of Y1 and N. This is a very special case: the Be(1/2) indicators are fair coin flips, and randomly thinning a Poisson count this way is exactly what makes the two pieces come out independent.

If you want to work with the joint distribution, show that the conditional distribution of Y2 does not depend on the value of Y1 (i.e. given Y1 = y1, the probability that Y2 = y2 is the same no matter what value y1 takes), and one way to do this is by using probability generating functions.
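Sketching the pgf route just mentioned (assuming the standard Poisson pgf $E[s^N] = e^{\lambda(s-1)}$): given $N = n$, each of the $n$ trials contributes a factor $s$ with probability $1/2$ (it lands in $Y_1$) or $t$ with probability $1/2$ (it lands in $Y_2$), so

$$
E\!\left[s^{Y_1} t^{Y_2}\right]
  = E\!\left[\, E\!\left[s^{Y_1} t^{Y_2} \mid N\right] \right]
  = E\!\left[\left(\tfrac{s+t}{2}\right)^{\!N}\right]
  = e^{\lambda\left(\frac{s+t}{2}-1\right)}
  = e^{\frac{\lambda}{2}(s-1)}\, e^{\frac{\lambda}{2}(t-1)},
$$

and the joint pgf factoring into two $Po(\lambda/2)$ pgfs gives independence and the marginals in one step.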

As a suggestion, one proof you might want to do is compute the distribution of Y1 directly from the total-probability sum (it comes out Poisson with mean lambda/2), get the distribution of Y2 the same way by symmetry, and then check that the joint probability factors into the product of those two marginals. That factorization is the whole proof.
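If you want a quick empirical sanity check of the claim before proving it (not a proof, of course), a short simulation works; the value lambda = 3 and the sample size here are arbitrary choices for illustration:

```python
import numpy as np

# Monte Carlo check: sample N ~ Po(lam), thin each of the N trials
# with a fair coin to get Y1, and set Y2 = N - Y1. If the claim holds,
# Y1 and Y2 should each look Po(lam/2): mean = variance = lam/2,
# and their sample covariance should be near zero.
rng = np.random.default_rng(0)
lam, trials = 3.0, 200_000

N = rng.poisson(lam, size=trials)
Y1 = rng.binomial(N, 0.5)          # sum of N fair Be(1/2) variables
Y2 = N - Y1

print(Y1.mean(), Y2.mean())        # both should be near lam/2 = 1.5
print(Y1.var(), Y2.var())          # Poisson: variance equals the mean
print(np.cov(Y1, Y2)[0, 1])        # should be near 0
```

Zero covariance alone is weaker than independence, but here it is a cheap red-flag test: if the covariance came out clearly nonzero, the independence claim would be dead on arrival.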