1. Bacteria and martingales

Let $\displaystyle N_n$, $\displaystyle n\geq 1$, be the size of a population of bacteria at time step $\displaystyle n$. At each step each bacterium produces a number of offspring and dies. The numbers of offspring are independent across bacteria and distributed according to the Poisson law with parameter $\displaystyle \lambda = 2$.
Assuming that $\displaystyle N_1=a>0$, the problem is to find the probability that the population eventually dies out, i.e. $\displaystyle \mathbb{P}(N_n=0$ for some $\displaystyle n\geq 1)$.
Note there is a hint: find $\displaystyle c$ such that $\displaystyle \exp(-cN_n)$ is a martingale.

This is a Markov process. Finding $\displaystyle c$ amounts to imposing the martingale property over one step: since each individual reproduces independently, $\displaystyle \mathbb{E}[e^{-cN_{n+1}}\mid N_n=k]=\exp\big(2k(e^{-c}-1)\big)$, so one needs $\displaystyle c=2(1-e^{-c})$, whose positive root is $\displaystyle c\approx 1.59$. The martingale makes it possible to use the Optional Sampling Theorem with a stopping time.
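As a quick numerical sanity check (a sketch, not part of the argument itself), the positive root of $\displaystyle c=2(1-e^{-c})$ can be found by fixed-point iteration, since the map $\displaystyle c\mapsto 2(1-e^{-c})$ is a contraction near that root:

```python
from math import exp

# Solve c = 2*(1 - exp(-c)) by fixed-point iteration.
# c = 0 is also a root, so we start away from it to land on the
# positive root of interest.
c = 1.0
for _ in range(100):
    c = 2 * (1 - exp(-c))

print(c)  # approximately 1.59, as stated above
```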

But which one? With $\displaystyle \tau = \inf\{n : N_n=0\}$, and $\displaystyle \tau=\infty$ if the population never dies out, the question arises of what $\displaystyle N_\infty$ should be. There is no condition that would give some form of convergence of the sequence. Is there any way to solve the problem using this stopping time?
Another approach would be to set a second barrier, with the stopping time $\displaystyle \tau_m=\inf\{n : N_n=0\ \text{or}\ N_n=m\}$, and then take the limit $\displaystyle m\rightarrow\infty$. This yields the value $\displaystyle \mathbb{P}(N_n=0$ for some $\displaystyle n\geq 1)=e^{-ca}$, which I suspect is the right answer.

But this assumes that the process $\displaystyle N_n$ is certain to reach any given bound $\displaystyle m$ if it does not hit zero. This surely has to do with the nature of the process. In a usual "birth-death" process one would say that, the mean number of offspring being $\displaystyle \rho=\lambda=2>1$, we know $\displaystyle \lim_{n\rightarrow\infty}\mathbb{E}(N_n)=\infty$. Is there any way to make a statement like this here?
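To build intuition, one can simulate the branching process and compare the empirical extinction frequency with $\displaystyle e^{-ca}$. The sketch below is an illustration, not a proof: it uses an arbitrary population cap as a stand-in for "survival" (once the population is large, extinction is astronomically unlikely when $\displaystyle \lambda=2$), and the cap and trial count are assumptions chosen for speed.

```python
import random
from math import exp

def poisson(lam, rng):
    # Knuth's multiplication method for sampling a Poisson(lam) variate.
    limit = exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def extinct(a, lam, rng, cap=500, max_gen=100):
    # True if the population started from a individuals dies out.
    # Crossing `cap` is treated as survival: from n = cap the chance of
    # dying out is about q**cap, which is negligible here. This cutoff
    # is the approximation that makes the simulation feasible.
    n = a
    for _ in range(max_gen):
        if n == 0:
            return True
        if n > cap:
            return False
        n = sum(poisson(lam, rng) for _ in range(n))
    return n == 0

rng = random.Random(1)
a, lam = 1, 2.0
trials = 4000
freq = sum(extinct(a, lam, rng) for _ in range(trials)) / trials
print(freq)  # should be near exp(-c*a) with c ~ 1.59, i.e. about 0.20
```

The empirical frequency landing near $\displaystyle e^{-c}$ for $\displaystyle a=1$ is consistent with the barrier argument above.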

2. Originally Posted by akbar
Let $\displaystyle N_n$, $\displaystyle n\geq 1$, be the size of a population of bacteria at time step $\displaystyle n$. At each step each bacterium produces a number of offspring and dies. The numbers of offspring are independent across bacteria and distributed according to the Poisson law with parameter $\displaystyle \lambda = 2$.
Assuming that $\displaystyle N_1=a>0$, the problem is to find the probability that the population eventually dies out, i.e. $\displaystyle \mathbb{P}(N_n=0$ for some $\displaystyle n\geq 1)$.
Note there is a hint: find $\displaystyle c$ such that $\displaystyle \exp(-cN_n)$ is a martingale.
Here's a way to justify the answer: $\displaystyle M_n=e^{-c N_n}$ is a martingale, and it is positive, hence it converges almost surely to a random variable $\displaystyle M_\infty\geq 0$. Taking logarithms, we deduce that either $\displaystyle N_n$ converges to a finite limit (if $\displaystyle M_\infty>0$) or to $\displaystyle +\infty$ (if $\displaystyle M_\infty=0$). Since $\displaystyle N_n$ is an integer, the first case means the sequence is eventually constant, and this can only happen if the population dies out. I'll let you devise your own argument for that (for instance, show $\displaystyle P(N_n=N_{n+1}=\cdots=N_{n+k}=m)\to_k 0$ for all $\displaystyle n,m>0$ and conclude, or just recall a result on Markov chains).
As a consequence, $\displaystyle M_\infty$ is equal either to 0 or to 1, corresponding to growth to infinity and to extinction respectively, and thus

$\displaystyle e^{-ca}=E[M_1]=E[M_\infty]=P(\text{extinction})$.

(The martingale is bounded, hence the middle equality is just a consequence of the bounded convergence theorem: $\displaystyle E[M_1]=E[M_n]\to E[M_\infty]$; no optional sampling is needed here.)
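For what it's worth, this value agrees with the classical Galton–Watson answer: the extinction probability starting from one individual is the smallest fixed point of the offspring generating function $\displaystyle \varphi(s)=e^{\lambda(s-1)}$, and independence of the $\displaystyle a$ initial lines gives $\displaystyle q^a$. A small numerical cross-check (iterating $\displaystyle \varphi$ from 0, which is the classical recursion $\displaystyle q_{n+1}=\varphi(q_n)$ for the probability of extinction by generation $\displaystyle n$ and converges to the smallest fixed point):

```python
from math import exp, log

lam = 2.0

# Iterate the offspring generating function phi(s) = exp(lam*(s-1))
# from 0; the iterates increase to the smallest fixed point q, the
# extinction probability from a single individual.
q = 0.0
for _ in range(200):
    q = exp(lam * (q - 1))

c = -log(q)  # then exp(-c) = q, and c solves c = lam*(1 - exp(-c))
print(q, c)  # q is about 0.20, c about 1.59, matching e^{-ca} above
```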

3. Thank you for your answer. Just one last question: which "bounded convergence theorem" are you referring to in your last sentence? Is it "uniformly bounded by integrable function" implies "uniformly integrable"?
Even if it's trivial, there are many theorems of this name and I wanted to be sure we're talking about the same.
Thanks again.

4. Originally Posted by akbar
Thank you for your answer. Just one last question: which "bounded convergence theorem" are you referring to in your last sentence? Is it "uniformly bounded by integrable function" implies "uniformly integrable"?
Even if it's trivial, there are many theorems of this name and I wanted to be sure we're talking about the same.
Thanks again.
Maybe you know it as Lebesgue's dominated convergence theorem: $\displaystyle |M_n|\leq 1$, which is integrable, and $\displaystyle M_n\to_n M_\infty$, hence $\displaystyle E[M_n]\to_n E[M_\infty]$. No uniform integrability, just very basic measure theory.

5. Yes the Dominated Convergence Theorem, which is obvious here. But I didn't want to miss an opportunity to learn another potential theorem about integration...
Thanks a lot.

6. Originally Posted by akbar
Yes the Dominated Convergence Theorem, which is obvious here. But I didn't want to miss an opportunity to learn another potential theorem about integration...
Thanks a lot.
At least you learned another name ;-) In fact, the name "bounded convergence theorem" relates only to the present case: if the measure space has finite measure, and the sequence is uniformly bounded by a constant.