# Thread: Convergence in Probability Questions

1. I am stuck with these questions. I am supposed to use the following definition to prove them:
$X_n \stackrel{\mbox{P}}{\longrightarrow} X$ if $\forall \epsilon > 0$

$\lim_{n \to \infty} P[|X_n-X| \geq \epsilon]=0$
or equivalently
$\lim_{n \to \infty} P[|X_n-X| < \epsilon]=1$
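The definition above can also be checked numerically. Below is a minimal Monte Carlo sketch (the specific choice $X_n = X + Z/\sqrt{n}$ with $Z$ standard normal is my own illustration, not part of the problem): the estimated tail probability $P[|X_n - X| \geq \epsilon]$ should shrink toward 0 as $n$ grows.

```python
import random

def tail_prob(n, eps=0.5, trials=20000, seed=0):
    """Monte Carlo estimate of P(|X_n - X| >= eps), where
    X_n = X + Z/sqrt(n) and Z is standard normal
    (an illustrative choice, not from the problem)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials)
               if abs(rng.gauss(0, 1)) / n ** 0.5 >= eps)
    return hits / trials

# As n grows, the estimated tail probability shrinks toward 0,
# which is exactly what convergence in probability asserts.
print([tail_prob(n) for n in (1, 10, 100)])
```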

1. Prove the following. Suppose $X_n \stackrel{\mbox{P}}{\longrightarrow} a$ and $g$ is a real function continuous at $a$. Then $g(X_n) \stackrel{\mbox{P}}{\longrightarrow} g(a)$

2. Let $\{a_n\}$ be a sequence of real numbers. Hence, we can also say that $\{a_n\}$ is a sequence of constant (degenerate) random variables. Let $a$ be a real number. Show that $a_n \longrightarrow a$ is equivalent to $a_n \stackrel{\mbox{P}}{\longrightarrow} a$
(Is this proving that convergence in probability is equivalent to pointwise convergence of sequences?)

2. Hello,
Originally Posted by akolman
I am stuck with these questions. I am supposed to use the following definition to prove them:
$X_n \stackrel{\mbox{P}}{\longrightarrow} X$ if $\forall \epsilon > 0$

$\lim_{n \to \infty} P[|X_n-X| \geq \epsilon]=0$
or equivalently
$\lim_{n \to \infty} P[|X_n-X| < \epsilon]=1$

1. Prove the following. Suppose $X_n \stackrel{\mbox{P}}{\longrightarrow} a$ and $g$ is a real function continuous at $a$. Then $g(X_n) \stackrel{\mbox{P}}{\longrightarrow} g(a)$
Remember the definition (or property) of continuity:
$f$ is continuous at $c$ $\Leftrightarrow \forall \delta>0, \exists \epsilon>0,~ \forall x,~ |x-c|<\epsilon \Rightarrow |f(x)-f(c)|<\delta$
(I swapped delta and epsilon from the usual definition so they wouldn't be confused with the epsilon you're using in the text.)

This means that we have the set inclusion: $\{\omega ~:~ |X_n(\omega)-a|< \epsilon \} \subseteq \{\omega ~:~ |g(X_n(\omega))-g(a)|< \delta \}$

One consequence of the Kolmogorov axioms is that if $A \subseteq B$, then $\mathbb{P}(A) \leqslant \mathbb{P}(B)$

Then use the fact that $\mathbb{P}(A)\leqslant 1$ (a consequence of Kolmogorov's second axiom) to prove that $\lim_{n \to \infty} \mathbb{P}(|g(X_n)-g(a)|<\delta)=1$

It should be easy from here
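Spelling out the hint as a chain of inequalities (my own summary, using the notation above):

```latex
P\bigl(|X_n - a| < \epsilon\bigr)
  \;\le\; P\bigl(|g(X_n) - g(a)| < \delta\bigr)
  \;\le\; 1 .
% The left-hand side tends to 1 because X_n -> a in probability,
% so the middle term is squeezed to 1 for every delta > 0,
% i.e. g(X_n) -> g(a) in probability.
```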

(Is this proving that convergence in probability is equivalent to pointwise convergence of sequences?)
Yes.

I haven't thought about it for very long, but my first idea would be, as above, to use the epsilon definition of $a_n \to a$

3. Thank you very much. I got confused with the epsilons.

4. I am sorry to bother you again, but I don't get this crucial step.

Originally Posted by Moo
Hello,

Remember the definition (or property) of continuity:
$f$ is continuous at $c$ $\Leftrightarrow \forall \delta>0, \exists \epsilon>0,~ \forall x,~ |x-c|<\epsilon \Rightarrow |f(x)-f(c)|<\delta$
(I swapped delta and epsilon from the usual definition so they wouldn't be confused with the epsilon you're using in the text.)

This means that the set : $\{\omega ~:~ |X_n(\omega)-a|< \epsilon \} \subseteq \{\omega ~:~ |g(X_n(\omega))-g(a)|< \delta \}$
Why is the set $\{\omega ~:~ |X_n(\omega)-a|< \epsilon \} \subseteq \{\omega ~:~ |g(X_n(\omega))-g(a)|< \delta \}$??

EDIT: typo corrected.

5. Typo....with $g(X_n(\omega))$

6. Originally Posted by matheagle
Typo....with $g(X_n(\omega))$
Why is it a typo?
Or more precisely... where is the typo? ^^

Originally Posted by akolman
I am sorry to bother you again, but I don't get this crucial step.

Why is the set $\{\omega ~:~ |X_n(\omega)-a|< \epsilon \} \subseteq \{\omega ~:~ |g(X_n(\omega))-g(a)|< \delta \}$??

EDIT: typo corrected.
Okay, so imagine there is an $\omega$ such that $|X_n(\omega)-a|< \epsilon$
The implication means that with that same $\omega$, $|g(X_n(\omega))-g(a)|<\delta$

Thus the set of $\omega$ such that $|X_n(\omega)-a|< \epsilon$ is included in the set of $\omega$ such that $|g(X_n(\omega))-g(a)|<\delta$.

It was a minor typo: you left out the second ")".
I thought I pointed that out.

8. Thanks again Moo.

I'm still only halfway through this problem:

2. Let $\{a_n\}$ be a sequence of real numbers. Hence, we can also say that $\{a_n\}$ is a sequence of constant (degenerate) random variables. Let $a$ be a real number. Show that $a_n \longrightarrow a$ is equivalent to $a_n \stackrel{\mbox{P}}{\longrightarrow} a$

I was able to prove that

$a_n \longrightarrow a$ $\Rightarrow$ $a_n \stackrel{\mbox{P}}{\longrightarrow} a$

But I don't know what to do to go the other way around.

If I assume $a_n \stackrel{\mbox{P}}{\longrightarrow} a$.

Then, $\forall \epsilon_{1}>0,~ \forall \epsilon_{2}>0,~ \exists N>0,~ \forall n > N, ~P(|a_n-a| \geq \epsilon_{1}) < \epsilon_{2}$

Is there a way to make $P(|a_n-a| \geq \epsilon_{1})=0$?

So I can get $|a_n-a|<\epsilon_{1}$ for $n>N^{*},~ \forall \epsilon_{1}>0$. Can I find an $N^{*}$ that makes this happen?

Or how should I proceed with this proof? Thanks again for your patience.
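One possible route for the converse (a sketch, not a full write-up): since each $a_n$ is a degenerate random variable, the event $\{|a_n-a| \geq \epsilon\}$ is either the whole sample space or empty, so its probability only takes the values 0 and 1:

```latex
P\bigl(|a_n - a| \ge \epsilon\bigr) =
\begin{cases}
  1 & \text{if } |a_n - a| \ge \epsilon,\\
  0 & \text{if } |a_n - a| < \epsilon.
\end{cases}
% If a_n -> a in probability, this probability is eventually < 1,
% hence equal to 0, hence |a_n - a| < epsilon for all n large enough,
% which is precisely a_n -> a as a sequence of real numbers.
```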