# Math Help - restriction of sigma-algebra

1. ## restriction of sigma-algebra

I hope some of you can help me out a little.

I read the following in an article, where it is stated as if it were an obvious fact. (http://research.microsoft.com/en-us/...papers/poi.pdf page 18, lemma 11)

Let $\mathcal{F}$ be a $\sigma$-algebra on the probability space $\Omega=\mathbb{R}$, and let
$S=[-r,r]$ be some bounded interval.

Let $\mathcal{F}_S$ be the $\sigma$-algebra generated by the restriction of $\mathcal{F}$ to $S$, and let $A\in \mathcal{F} = \mathcal{F}_{\mathbb{R}}$. Why is it that for every $\epsilon>0$ there exist some $r=r(\epsilon)$ and an event $A_{\epsilon}\in \mathcal{F}_{[-r,r]}$ such that $\mathbb{P}(A\,\Delta\, A_{\epsilon}) < \epsilon$?

In other words, $A$ can be approximated arbitrarily well by $A_{\epsilon}$. Is this obvious?
Is it because $\mathcal{F}_{[-r,r]}$ increases to $\mathcal{F}$ as $r\to\infty$ (together with continuity of the probability measure $\mathbb{P}$)?
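For context, here is the approximation argument I have in mind, assuming the family $\mathcal{F}_{[-r,r]}$ is increasing in $r$, so that its union is an algebra generating $\mathcal{F}$ (I am not sure this is how the paper intends it):

```latex
% Sketch, assuming \mathcal{F}_{[-r,r]} increases in r, so the union is an algebra:
\mathcal{G} := \bigcup_{r>0} \mathcal{F}_{[-r,r]}, \qquad \sigma(\mathcal{G}) = \mathcal{F}.
% The class of events approximable from \mathcal{G},
\mathcal{D} := \bigl\{ A \in \mathcal{F} :
    \forall \epsilon > 0 \ \exists A_\epsilon \in \mathcal{G},\
    \mathbb{P}(A \,\Delta\, A_\epsilon) < \epsilon \bigr\},
% contains \mathcal{G} and is closed under complements and countable unions
% (the latter by continuity of \mathbb{P}), hence \mathcal{D} = \sigma(\mathcal{G}) = \mathcal{F}.
```

If that reasoning is right, then picking $A_\epsilon \in \mathcal{G}$ means $A_\epsilon \in \mathcal{F}_{[-r,r]}$ for some finite $r = r(\epsilon)$, which is exactly the claim.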

It is also claimed that $\mathcal{F}_{(-2r,0]}\subset \mathcal{F}_{(-\infty,0]}$. Is this true?

Does something more general hold, like $\mathcal{F}_A\subset \mathcal{F}_B$ whenever $A\subset B$? I suspect not in full generality.
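My own attempt: if $\mathcal{F}_S$ means the trace $\sigma$-algebra $\{F\cap S : F\in\mathcal{F}\}$ (an assumption on my part), then the inclusion seems to go through whenever $A$ itself is $\mathcal{F}$-measurable:

```latex
% Assuming the trace definition \mathcal{F}_S = \{ F \cap S : F \in \mathcal{F} \},
% and A \subseteq B with A \in \mathcal{F}:
F \cap A = (F \cap A) \cap B \in \mathcal{F}_B
    \quad \text{for every } F \in \mathcal{F},
% since F \cap A \in \mathcal{F}, so \mathcal{F}_A \subseteq \mathcal{F}_B.
% Without A \in \mathcal{F}, I do not see why this should hold.
```

Since $(-2r,0]$ and $(-\infty,0]$ are intervals (hence presumably in $\mathcal{F}$), this would explain the claimed inclusion, but I would appreciate confirmation.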

Really looking forward to your suggestions.