# random variable convergence


• Jun 3rd 2011, 09:25 AM
waytogo
random variable convergence
Hello,

My question is about one point in the proof of the proposition that almost sure convergence of random variables implies convergence in probability.

A sequence $(X_n)_{n\in \mathbb{N}}$ of random variables is said to converge almost surely to the random variable $X$ if $P(\{\omega : X_n(\omega)\to X(\omega),\ n\to \infty\})=1$, which is equivalent to $\forall \varepsilon>0\ \forall \delta>0\ \exists N : P\left(\bigcap_{n=N}^\infty[\omega: |X_n - X|< \varepsilon]\right)\geq 1-\delta$.

In the proof of this proposition (in the book by John B. Thomas) it is said that
$P\left(\bigcap_{n=N}^\infty[\omega: |X_n - X|< \varepsilon]\right)\geq 1-\delta$ is equivalent to $P\left(\bigcap_{n=N}^\infty[\omega: |X_n - X|\geq \varepsilon]\right)< \delta$.

However, using De Morgan's law and a property of the probability measure I get that
$P\left(\bigcap_{n=N}^\infty[\omega: |X_n - X|< \varepsilon]\right)\geq 1-\delta$ is equivalent to $P\left(\bigcup_{n=N}^\infty[\omega: |X_n - X|\geq \varepsilon]\right)< \delta$,
which is an essentially different statement.
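Writing out the complement step explicitly (this is the reasoning I am using, with $[\cdot]$ denoting the event in question):

```latex
\begin{align*}
P\Big(\bigcap_{n=N}^\infty [\,|X_n - X| < \varepsilon\,]\Big) \ge 1-\delta
&\iff P\Big(\Big(\bigcap_{n=N}^\infty [\,|X_n - X| < \varepsilon\,]\Big)^{c}\Big) \le \delta
  && \text{(complement rule)} \\
&\iff P\Big(\bigcup_{n=N}^\infty [\,|X_n - X| < \varepsilon\,]^{c}\Big) \le \delta
  && \text{(De Morgan)} \\
&\iff P\Big(\bigcup_{n=N}^\infty [\,|X_n - X| \ge \varepsilon\,]\Big) \le \delta .
\end{align*}
```

So the complement of the intersection is a union, not another intersection.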

Can anybody comment on this?
• Jun 3rd 2011, 12:19 PM
theodds
It looks like a typo to me.

For me, the easiest thing to remember is that $X_n \to X$ almost surely if and only if for every $\epsilon > 0$ we have $P([\omega: |X_n - X| \ge \epsilon \text{ for infinitely many } n]) = P(\limsup_n [\omega: |X_n -X| \ge \epsilon]) = 0$ (the first equality being by definition). Then the result follows from

$\limsup P([\omega: |X_n - X| \ge \epsilon]) \le P(\limsup [\omega: |X_n - X| \ge \epsilon])$.
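A sketch of why that last inequality holds, using continuity of the probability measure from above:

```latex
\begin{align*}
\text{Let } A_n &= [\,\omega : |X_n - X| \ge \epsilon\,]
  \quad\text{and}\quad B_N = \bigcup_{n=N}^\infty A_n . \\
\text{Then } B_N &\downarrow \limsup_n A_n \text{ as } N \to \infty,
  \text{ so } P(B_N) \to P\big(\limsup_n A_n\big)
  \quad\text{(continuity from above)}. \\
\text{Since } A_N &\subseteq B_N, \text{ we have } P(A_N) \le P(B_N), \text{ and therefore} \\
\limsup_N P(A_N) &\le \lim_{N\to\infty} P(B_N) = P\big(\limsup_n A_n\big).
\end{align*}
```

So if the right-hand side is $0$, then $P(A_n) \to 0$, which is exactly convergence in probability.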