I hope some of you can help me out a little.

I read the following in an article, where it is stated as if it were an obvious fact. (http://research.microsoft.com/en-us/...papers/poi.pdf, page 18, Lemma 11)

Let $\displaystyle \mathcal{F}$ be a $\displaystyle \sigma$-algebra on the probability space $\displaystyle \Omega=\mathbb{R}$, and let

$\displaystyle S=[-r,r]$ be some bounded interval.

Let $\displaystyle \mathcal{F}_S$ be the $\displaystyle \sigma$-algebra generated by the restriction of $\displaystyle \mathcal{F}$ to $\displaystyle S$, and let $\displaystyle A\in \mathcal{F} = \mathcal{F}_{\mathbb{R}}$. Why does there exist, for every $\displaystyle \epsilon>0$, some $\displaystyle r=r(\epsilon)$ and an event $\displaystyle A_{\epsilon}\in \mathcal{F}_{[-r,r]}$ such that $\displaystyle \mathbb{P}(A\,\Delta\, A_{\epsilon}) < \epsilon$?

In other words, $\displaystyle A$ can be approximated arbitrarily well (in probability) by events that are measurable on a bounded interval. Is this obvious?

Is it because $\displaystyle \bigcup_{r>0}\mathcal{F}_{[-r,r]}$ generates $\displaystyle \mathcal{F}$ (together with continuity of the probability measure $\displaystyle \mathbb{P}$)?
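To make my guess precise, here is the approximation argument I have in mind. This is only a sketch, and it assumes two things I believe the paper uses but have not verified there: that the $\displaystyle \mathcal{F}_{[-r,r]}$ increase in $\displaystyle r$, and that $\displaystyle \mathcal{F}=\sigma\bigl(\bigcup_{r>0}\mathcal{F}_{[-r,r]}\bigr)$.

```latex
% Sketch, assuming F_{[-r,r]} increase in r and together generate F.
Let $\mathcal{A} := \bigcup_{r>0} \mathcal{F}_{[-r,r]}$.
An increasing union of $\sigma$-algebras is an \emph{algebra}
(closed under complements and finite unions, though not necessarily
under countable unions), and by assumption $\sigma(\mathcal{A}) = \mathcal{F}$.

Now consider the collection of "good sets"
\[
  \mathcal{G} := \bigl\{ A \in \mathcal{F} :\ \forall \epsilon > 0\
    \exists A_\epsilon \in \mathcal{A} \text{ with }
    \mathbb{P}(A \,\Delta\, A_\epsilon) < \epsilon \bigr\}.
\]
Then $\mathcal{G}$ is a $\sigma$-algebra containing $\mathcal{A}$:
closure under complements follows from
$A \,\Delta\, A_\epsilon = A^c \,\Delta\, A_\epsilon^c$;
for $A = \bigcup_n A_n$ with $A_n \in \mathcal{G}$, continuity of
$\mathbb{P}$ from below gives an $N$ with
$\mathbb{P}\bigl(A \setminus \bigcup_{n \le N} A_n\bigr) < \epsilon/2$,
and approximating each $A_n$, $n \le N$, by some $B_n \in \mathcal{A}$
to within $\epsilon/2^{n+1}$ yields
$B := \bigcup_{n \le N} B_n \in \mathcal{A}$ with
$\mathbb{P}(A \,\Delta\, B) < \epsilon$.
Hence $\mathcal{G} \supset \sigma(\mathcal{A}) = \mathcal{F}$,
and since $A_\epsilon \in \mathcal{A}$ means
$A_\epsilon \in \mathcal{F}_{[-r,r]}$ for some $r = r(\epsilon)$,
this is exactly the claimed approximation.
```

Is this the intended argument, or is the paper using something slicker?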

It is also claimed that $\displaystyle \mathcal{F}_{(-2r,0]}\subset \mathcal{F}_{(-\infty,0]}$. Is this true?

Something more general, like $\displaystyle \mathcal{F}_A\subset \mathcal{F}_B$ whenever $\displaystyle A\subset B$, does not hold in general, right?
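My hesitation is that the answer seems to depend on how $\displaystyle \mathcal{F}_S$ is defined. If (as I suspect, but have not checked in the paper) it is the $\displaystyle \sigma$-algebra generated by the underlying process $\displaystyle X$ restricted to $\displaystyle S$, then monotonicity looks automatic:

```latex
% Assumption (mine, not verified in the paper): F_S := sigma(X_t : t in S).
If $A \subset B$, then
\[
  \{X_t : t \in A\} \subset \{X_t : t \in B\}
  \quad\Longrightarrow\quad
  \mathcal{F}_A = \sigma(X_t : t \in A)
  \subset \sigma(X_t : t \in B) = \mathcal{F}_B,
\]
since a $\sigma$-algebra generated by a smaller family of random
variables is contained in the one generated by a larger family.
```

Or am I missing something, e.g. a different definition of the restriction under which this fails?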

Really looking forward to your suggestions.