Hi,

I'm currently reading a book on probability theory (Resnick, "A Probability Path") and in a section on basic set and measure theory, the indicator function is defined as follows:

$$1_{A}\left(\omega\right)=\begin{cases} 1, & \omega\in A\\ 0, & \omega\in A^{c} \end{cases}$$

and some properties are discussed, including:

$$1_{\cup_{n}A_{n}}\le\sum_{n}1_{A_{n}}$$

for which it is stated: "and if the sequence $\left\{ A_{n}\right\}$ is mutually disjoint, then equality holds."

However, I don't understand this: the LHS can only ever be 0 or 1, whereas the RHS can be any integer between 0 and the number of sets in the sequence. How could equality ever hold except in specific cases?

Update: Actually, I just realised I'm being stupid. If the sets are disjoint, then any $\omega$ lies in at most one $A_{n}$, so the RHS can also only be 0 or 1.
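For anyone else puzzling over this, here is a quick numerical sanity check I put together (my own sketch, not from Resnick): it verifies that for disjoint sets the sum of indicators equals the indicator of the union, while for overlapping sets only the inequality holds.

```python
def indicator(A):
    """Return the indicator function 1_A of a set A."""
    return lambda w: 1 if w in A else 0

# Disjoint subsets of the sample space {0, ..., 9} (made-up example sets)
A1, A2, A3 = {0, 1}, {4, 5}, {8}
sets = [A1, A2, A3]
union = A1 | A2 | A3

for w in range(10):
    lhs = indicator(union)(w)
    rhs = sum(indicator(A)(w) for A in sets)
    assert lhs == rhs  # equality, since the sets are mutually disjoint

# With overlapping sets, only the inequality 1_{B1 ∪ B2} <= 1_{B1} + 1_{B2} holds:
B1, B2 = {0, 1}, {1, 2}
for w in range(10):
    assert indicator(B1 | B2)(w) <= indicator(B1)(w) + indicator(B2)(w)

print("checks passed")
```

At $\omega = 1$ in the overlapping case the RHS is 2 while the LHS is 1, which is exactly the strict-inequality case I was worried about.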

Any pointers greatly appreciated!

Chris