Let $\displaystyle (X,\mathcal{M},\mu)$ be a measure space and let $\displaystyle f\in L^{+}$ be integrable, i.e. $\displaystyle \int f\,d\mu<\infty$. Show that for each $\displaystyle \epsilon>0$ there is a $\displaystyle \delta>0$ such that if $\displaystyle A\subset X$ and $\displaystyle \mu(A)<\delta$,

then $\displaystyle \int_{A} f\,d\mu<\epsilon$. Show that the result may fail if the integrability assumption on $\displaystyle f$ is dropped.
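In case a starting point helps, here is a hedged sketch of one standard route (truncation plus Monotone Convergence), together with a possible counterexample for the second part; the choices of $\displaystyle f_n$, $\displaystyle \delta$, and the example $\displaystyle f(x)=1/x$ are my own, not forced by the problem:

```latex
% Sketch: truncate f. Let f_n = \min(f, n), so f_n \uparrow f and, by the
% Monotone Convergence Theorem, \int f_n \, d\mu \to \int f \, d\mu < \infty.
% Fix n with \int (f - f_n)\, d\mu < \epsilon/2 and set \delta = \epsilon/(2n).
% Then for any A with \mu(A) < \delta,
\int_A f \, d\mu
  = \int_A (f - f_n)\, d\mu + \int_A f_n \, d\mu
  < \frac{\epsilon}{2} + n\,\mu(A)
  < \epsilon.

% Failure without integrability: take X = (0,1] with Lebesgue measure and
% f(x) = 1/x, which lies in L^+ but is not integrable. The sets (0,\delta)
% have arbitrarily small measure, yet
\int_{(0,\delta)} \frac{1}{x}\, dx = \infty
  \quad \text{for every } \delta > 0.
```

So no single $\displaystyle \delta$ can work for any $\displaystyle \epsilon$ once integrability is dropped.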