A constant is integrable... because the measure of $\Omega$ is 1. So we have $\int_\Omega c \, dP = c \, P(\Omega) = c < \infty$ for any constant $c$.
Let $(X_n)_{n \ge 1}$ be a sequence of random variables on a probability space $(\Omega, \mathcal{F}, P)$ such that $E[X_n^2] \le C$ for some constant $C$. Assume that $X_n \to X$ almost surely as $n \to \infty$. How do you prove that $E[X]$ is finite and $E[X_n] \to E[X]$?
I obviously thought about using the Dominated Convergence Theorem. The Cauchy-Schwarz inequality ensures the expectation is finite. The problem is that the sequence is dominated by a constant, which is not (necessarily) integrable over $\Omega$, so the theorem cannot be applied as it is. But surely there is something I missed.
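For the record, here is how I spell out the finiteness step, combining Cauchy-Schwarz with Fatou's lemma (my own write-up, in case it helps):

```latex
% X_n^2 -> X^2 almost surely, so Fatou's lemma bounds the second moment of the limit:
E[X^2] \le \liminf_{n \to \infty} E[X_n^2] \le C,
% and Cauchy-Schwarz (against the constant function 1) then gives
E|X| \le \sqrt{E[X^2]} \, \sqrt{E[1^2]} = \sqrt{E[X^2]} \le \sqrt{C} < \infty.
```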
Thanks in advance for your help.
Your problem is a consequence of this one, which is more general. In fact, in your case, the proof I gave in the mentioned thread can be much simplified, as follows.
The key word is "Egoroff": the convergence is uniform on an arbitrarily large event. Where the convergence is uniform, the expectations converge; on the small event where convergence may fail, you can use Cauchy-Schwarz to bound the expectation by a small number.
I let you decrypt and make explicit the previous sentence: for $\varepsilon > 0$, Egoroff's theorem gives an event $A$ with $P(A^c) \le \varepsilon$ on which $X_n \to X$ uniformly, hence $|E[X_n] - E[X]| \le E[|X_n - X| \mathbf{1}_A] + E[|X_n - X| \mathbf{1}_{A^c}]$, and $E[|X_n - X| \mathbf{1}_{A^c}] \le \|X_n - X\|_2 \sqrt{P(A^c)} \le 2\sqrt{C\varepsilon}$ (since $\|X\|_2 \le \sqrt{C}$ by Fatou). "qed".
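If you want to see the statement in action numerically, here is a quick Monte Carlo sanity check (the sequence $X_n = U + \sqrt{n}\,\mathbf{1}_{\{U < 1/n\}}$ with $U$ uniform on $(0,1)$ is my own toy example, not from your book): it satisfies $E[X_n^2] \le 7/3$ for all $n$, converges almost surely to $U$, and $E[X_n] = \tfrac12 + \tfrac{1}{\sqrt{n}}$ indeed approaches $E[U] = \tfrac12$.

```python
import numpy as np

def mean_estimates(ns, n_samples=1_000_000, seed=0):
    """Monte Carlo estimates of E[X_n] for X_n = U + sqrt(n) * 1{U < 1/n}."""
    rng = np.random.default_rng(seed)
    u = rng.random(n_samples)  # samples of U ~ Uniform(0, 1); X_n -> U a.s.
    return {n: float((u + np.sqrt(n) * (u < 1.0 / n)).mean()) for n in ns}

for n, m in mean_estimates([1, 100, 10_000]).items():
    print(n, m)  # estimates drift toward E[U] = 0.5 as n grows
```

Note that the jump $\sqrt{n}\,\mathbf{1}_{\{U < 1/n\}}$ is tall but rare, which is exactly the kind of behavior the Egoroff argument handles: it lives on a small event, and Cauchy-Schwarz controls its contribution.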
Sorry about the silly remark on the integrability of a constant. For some obscure reason I had been wrongly thinking in terms of the Lebesgue measure, probably because I was desperate. But I was also suggesting that the result does not necessarily depend on the choice of measure (true, apparently).
Egoroff's theorem is indeed mentioned in the book's chapter, so the proof makes perfect sense. In the end it was trickier than I thought.