1. Probability Generating Function

I found this in one of the review exercises.

Let $X$ be a discrete random variable with values in $\mathbb Z_{+} := \{0,1,2,\dots\}$ with probability generating function

$P_{X}(z) = \mathbb E(z^{X}) = \sum^{+ \infty}_{k=0} \mathbb P (X = k)z^{k}$

(i) Show that $P_X$ defines a mapping from [0,1] to [0,1].
(ii) Let $(X_n)$ be a sequence of i.i.d random variables with values in $\mathbb Z_{+}$ and $T$ a random variable with values in $\mathbb Z_{+}$ which is independent of $(X_n)$. Define

$S_{0} = 0$ and $S_{T} = \sum^{T}_{j=1} X_{j}$

Show that if we denote by $P_{T}$ and $P_{X}$ the probability generating functions of $T$ and $X_{1}$ respectively, then the probability generating function of $S := S_T$ is given by

$\forall z \in [0,1], P_{S}(z) = P_{T}(P_{X}(z))$.

Thank you (:

2. Hello,
Originally Posted by panda*
I found this in one of the review exercises.

Let $X$ be a discrete random variable with values in $\mathbb Z_{+} := \{0,1,2,\dots\}$ with probability generating function

$P_{X}(z) = \mathbb E(z^{X}) = \sum^{+ \infty}_{k=0} \mathbb P (X = k)z^{k}$

(i) Show that $P_X$ defines a mapping from [0,1] to [0,1].
I'll think on this one later ^^
(ii) Let $(X_n)$ be a sequence of i.i.d random variables with values in $\mathbb Z_{+}$ and $T$ a random variable with values in $\mathbb Z_{+}$ which is independent of $(X_n)$. Define

$S_{0} = 0$ and $S_{T} = \sum^{T}_{j=1} X_{j}$

Show that if we denote by $P_{T}$ and $P_{X}$ the probability generating functions of $T$ and $X_{1}$ respectively, then the probability generating function of $S := S_T$ is given by

$\forall z \in [0,1], P_{S}(z) = P_{T}(P_{X}(z))$.

Thank you (:
Like for any problem of this kind, take a partition of the sample space.
Namely $\Omega=\bigcup_{k=0}^\infty \{T=k\}$

($\{.\}$ denotes an event... if you prefer, it's the set of $\omega\in\Omega$ such that the event happens)

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Thus
\begin{aligned}
\{S_T=n\}&=\{S_T=n\}\cap \Omega \\
&=\{S_T=n\}\cap \bigcup_{k=0}^\infty \{T=k\} \\
&=\bigcup_{k=0}^\infty \{S_T=n ~,~ T=k\} \quad (\star)
\end{aligned}

Hence :
\begin{aligned}
\mathbb{P}(S_T=n) &=\mathbb{P}\left(\bigcup_{k=0}^\infty \{S_T=n ~,~ T=k\}\right) \\
&=\sum_{k=0}^\infty \mathbb{P}(S_T=n ~,~ T=k) \quad (\star\star) \\
&=\sum_{k=0}^\infty \mathbb{P}(S_T=n \mid T=k)\mathbb{P}(T=k) \quad (\star\star\star) \\
&=\sum_{k=0}^\infty \mathbb{P}(S_k=n)\mathbb{P}(T=k) \end{aligned}
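As a quick sanity check of this identity (not part of the proof), here is a small numeric example; the Bernoulli/uniform choice and the parameter $p$ below are purely illustrative:

```python
from math import comb

# Illustrative example: X_j ~ Bernoulli(p), so S_k ~ Binomial(k, p),
# and T uniform on {0, 1, 2}. All names and parameters are my own choices.
p = 0.3
pT = {0: 1/3, 1: 1/3, 2: 1/3}          # distribution of T

def pSk(k, n):
    # P(S_k = n) for S_k ~ Binomial(k, p)
    return comb(k, n) * p**n * (1 - p)**(k - n) if 0 <= n <= k else 0.0

def pST(n):
    # Right-hand side of the identity: sum_k P(S_k = n) P(T = k)
    return sum(pSk(k, n) * pT[k] for k in pT)

print([round(pST(n), 4) for n in range(3)])   # → [0.73, 0.24, 0.03]
print(round(sum(pST(n) for n in range(3)), 9))  # the probabilities sum to 1
```

Here $S_T$ can only take the values 0, 1, 2, and the three probabilities produced by the formula indeed sum to 1.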

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

We have :
\begin{aligned}P_S(z) &=\mathbb{E}(z^{S_T}) \\
&=\sum_{n=0}^\infty \mathbb{P}(S_T=n)z^n \\
&=\sum_{n=0}^\infty \sum_{k=0}^\infty \mathbb{P}(S_k=n)\mathbb{P}(T=k)z^n \\
&=\sum_{k=0}^\infty \mathbb{P}(T=k) \sum_{n=0}^\infty \mathbb{P}(S_k=n)z^n \\
&=\sum_{k=0}^\infty \mathbb{P}(T=k) P_{S_k}(z) \end{aligned}

(interchanging the two sums is legitimate because every term is nonnegative)

But what is $P_{S_k}(z)$ ?
$S_k=\sum_{j=1}^k X_j$

Hence $P_{S_k}(z)=\mathbb{E}(z^{S_k})=\mathbb{E}\left(z^{ X_1+\dots+X_k}\right)=\mathbb{E}(z^{X_1}\cdots z^{X_k})$

Since the $X_i$'s are independent, this is equal to the product of the expectations :
$P_{S_k}(z)=\mathbb{E}(z^{X_1})\cdots\mathbb{E}(z^{ X_k})$

Since the $X_i$'s follow the same distribution as $X$, this is :
$P_{S_k}(z)=[P_X(z)]^k$

Now finally, we have :
$P_S(z)=\sum_{k=0}^\infty \mathbb{P}(T=k)[P_X(z)]^k$

Can you recognize that it's the pgf of T, taken at a particular value ?
Spoiler:
namely at $P_X(z)$

And you're done... I hope you like it, because it's been quite loooooooong to type lol!
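If you want to convince yourself numerically, here is a small Monte Carlo check of $P_S(z)=P_T(P_X(z))$; taking both $X$ and $T$ Poisson, and all the parameters below, is just an illustration of mine, not part of the exercise:

```python
import random
from math import exp

# Monte Carlo sanity check of P_S(z) = P_T(P_X(z)).
# Illustrative setup: X ~ Poisson(lam_X), T ~ Poisson(lam_T), independent.
random.seed(0)
lam_X, lam_T, z = 1.5, 2.0, 0.6

def pgf_poisson(lam, z):
    # For X ~ Poisson(lam), the pgf is exp(lam * (z - 1)).
    return exp(lam * (z - 1))

def sample_poisson(lam):
    # Knuth's multiplication algorithm (fine for small lam)
    L, k, prod = exp(-lam), 0, random.random()
    while prod > L:
        k += 1
        prod *= random.random()
    return k

# Estimate E[z^{S_T}] by simulation
N = 200_000
acc = 0.0
for _ in range(N):
    T = sample_poisson(lam_T)
    S = sum(sample_poisson(lam_X) for _ in range(T))
    acc += z ** S
estimate = acc / N
exact = pgf_poisson(lam_T, pgf_poisson(lam_X, z))
print(round(estimate, 3), round(exact, 3))
```

The two printed numbers agree to within Monte Carlo error, which is exactly what the identity predicts.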

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

$(\star)$ is obtained just by using the distributive law of intersection and union. It shouldn't be difficult to prove it.

$(\star\star)$ because for $k\neq m$, $\{T=k\}$ and $\{T=m\}$ are disjoint (their intersection is the empty set), and hence $\{S_T=n,\ T=k\}$ and $\{S_T=n,\ T=m\}$ are disjoint.
The equality follows from the third axiom of Kolmogorov.

$(\star\star\star)$ is the multiplication rule $\mathbb{P}(A,B)=\mathbb{P}(A\mid B)\mathbb{P}(B)$, i.e. a rearrangement of the definition of conditional probability. The last equality then uses the independence of $T$ and the $X_j$'s : conditionally on $\{T=k\}$, $S_T$ has the same distribution as $S_k$, so $\mathbb{P}(S_T=n\mid T=k)=\mathbb{P}(S_k=n)$.

Edit : lol...that's my second 5th post XD

3. Originally Posted by panda*
I found this in one of the review exercises.

Let $X$ be a discrete random variable with values in $\mathbb Z_{+} := \{0,1,2,\dots\}$ with probability generating function

$P_{X}(z) = \mathbb E(z^{X}) = \sum^{+ \infty}_{k=0} \mathbb P (X = k)z^{k}$

(i) Show that $P_X$ defines a mapping from [0,1] to [0,1].
[snip]
Observe that $P_X(0) = \mathbb{P}(X = 0) \in [0,1]$, that $P_X(1) = \sum_{k=0}^\infty \mathbb{P}(X=k) = 1$, and that $P_X(z)$ is an increasing function of $z$ on $[0,1]$ (every coefficient $\mathbb{P}(X=k)$ is nonnegative). Together these give $0 \le P_X(z) \le 1$ for all $z \in [0,1]$.
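For what it's worth, a quick numeric illustration of (i); the geometric distribution and the truncation at 2000 terms are my own choices:

```python
# Illustration of (i): X ~ Geometric on {0, 1, 2, ...} with P(X = k) = q(1-q)^k.
q = 0.4

def pgf_geometric(z, terms=2000):
    # Truncated series for P_X(z) = sum_k q (1-q)^k z^k
    return sum(q * (1 - q)**k * z**k for k in range(terms))

values = [pgf_geometric(i / 10) for i in range(11)]
# each value lies in [0, 1], the sequence is increasing, and P_X(1) = 1
print(all(0 <= v <= 1 for v in values))                  # → True
print(all(a <= b for a, b in zip(values, values[1:])))   # → True
print(round(values[0], 6), round(values[-1], 6))         # → 0.4 1.0
```

The evaluations at $z = 0, 0.1, \dots, 1$ all land in $[0,1]$ and increase from $\mathbb{P}(X=0) = 0.4$ up to $1$, matching the three observations above.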

4. Moo, thanks so much for your effort! I understand better now. Thank you again.

Awkward, is that all you have to show?