1. Probability Independence

A sequence of N independent Bernoulli trials is performed, where N is a non-negative integer-valued random variable, and the probability of success on any one trial is p. Let S be the total number of successes and F be the total number of failures (S + F = N). Show that the joint probability generating function of S and F is given by $G_{S,F}(s,t) = G_N(ps + (1 - p)t)$, where $G_N$ is the probability generating function of N.
Hint: use conditional expectation, $E(\cdot) = E(E(\cdot \mid N))$.

Also, show that if N has a Poisson distribution, then S and F are independent.

What is $E(\cdot)$? I don't recall ever seeing that. I have no chance in hell of doing this question without help.

2. Originally Posted by woody198403

A sequence of N independent Bernoulli trials is performed, where N is a non-negative integer-valued random variable, and the probability of success on any one trial is p. Let S be the total number of successes and F be the total number of failures (S + F = N). Show that the joint probability generating function of S and F is given by $G_{S,F}(s,t) = G_N(ps + (1 - p)t)$, where $G_N$ is the probability generating function of N.
Hint: use conditional expectation, $E(\cdot) = E(E(\cdot \mid N))$.

Also, show that if N has a Poisson distribution, then S and F are independent.

What in God's name is $E(\cdot)$? I don't recall ever seeing that. I have no chance in hell of doing this question without help.

This might get you started (but I might be barking up the wrong tree):

1. Since N is itself random, the binomial statements hold conditionally: given $N = n$, $S \mid N = n \sim \text{Binomial}(n, p)$, and since $F = N - S$, $F \mid N = n \sim \text{Binomial}(n, 1 - p)$.

2. The joint probability generating function the question asks for is $G_{S,F}(s, t) = E\left( s^S t^F \right)$; the moment generating function analogue would be $m_{S, F} (t_1, \, t_2) = E\left( e^{t_1 S + t_2 F}\right)$.
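Putting the hint to work (a sketch: given $N = n$, each of the $n$ independent trials contributes a factor $s$ with probability $p$ and a factor $t$ with probability $1-p$):

$$G_{S,F}(s,t) = E\left( s^S t^F \right) = E\left( E\left( s^S t^F \mid N \right) \right) = E\left[ \left( ps + (1-p)t \right)^N \right] = G_N\left( ps + (1-p)t \right).$$

For the Poisson part, if $N \sim \text{Poisson}(\lambda)$ then $G_N(z) = e^{\lambda(z-1)}$, so

$$G_{S,F}(s,t) = e^{\lambda\left( ps + (1-p)t - 1 \right)} = e^{\lambda p (s-1)} \, e^{\lambda (1-p)(t-1)} = G_S(s) \, G_F(t),$$

and since the joint PGF factorizes into the PGFs of $\text{Poisson}(\lambda p)$ and $\text{Poisson}(\lambda(1-p))$, S and F are independent.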

3. Originally Posted by woody198403

What is E( . )? I dont recall ever seeing that. I have no chance in hell of doing this question without help

$E(\cdot)$ is the expectation operator: it gives the expected value (the probability-weighted average) of a random variable, or of a function of a random variable.
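A rough way to see what $E(\cdot)$ means in practice (a simulation sketch; the Bernoulli setup and the parameters $n = 10$, $p = 0.3$ are made up for the example): the sample mean over many simulated draws approximates the expectation.

```python
import random

# A minimal sketch: E(.) as a long-run average, estimated by simulation.
# X is the number of successes in n = 10 Bernoulli(p = 0.3) trials,
# so E(X) = n * p = 3 exactly; the sample mean should land close to 3.
rng = random.Random(42)
n, p, reps = 10, 0.3, 100_000

total = 0
for _ in range(reps):
    total += sum(rng.random() < p for _ in range(n))

estimate = total / reps  # Monte Carlo estimate of E(X)
print(estimate)
```

The point of the sketch is only that the average of many realizations converges to $E(X)$, which is what the conditional-expectation hint $E(\cdot) = E(E(\cdot \mid N))$ manipulates symbolically.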

RonL
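To complement the algebra, here is a quick numerical check of the Poisson case (a sketch; $\lambda = 4$ and $p = 0.3$ are arbitrary choices, and Poisson sampling is done with Knuth's product-of-uniforms method): the sample covariance of S and F should be near zero, with the sample means near $\lambda p$ and $\lambda(1-p)$.

```python
import math
import random

def sample_poisson(lam, rng):
    """Knuth's method: count uniforms until their running product drops below e^-lam."""
    threshold = math.exp(-lam)
    k, prod = 0, rng.random()
    while prod > threshold:
        k += 1
        prod *= rng.random()
    return k

rng = random.Random(0)
lam, p, reps = 4.0, 0.3, 20_000

pairs = []
for _ in range(reps):
    n = sample_poisson(lam, rng)
    s = sum(rng.random() < p for _ in range(n))  # successes among the N trials
    pairs.append((s, n - s))                     # (S, F) with F = N - S

mean_s = sum(s for s, _ in pairs) / reps         # should be near lam * p = 1.2
mean_f = sum(f for _, f in pairs) / reps         # should be near lam * (1 - p) = 2.8
cov = sum((s - mean_s) * (f - mean_f) for s, f in pairs) / reps  # near 0 if independent
print(mean_s, mean_f, cov)
```

A near-zero covariance is of course only consistent with independence, not a proof of it; the factorization of the joint PGF is what actually establishes the result.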