Theorem: Let F(x) be the distribution function of X.

If X is any r.v. (discrete, continuous, or mixed) defined on the interval [a,∞) (or some subset of it), then

E(X) = a + ∫_a^∞ [1 - F(x)] dx
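As a quick sanity check (my own worked instance, not part of the original theorem statement), the formula recovers the familiar mean of a uniform distribution:

```latex
% Worked instance: X ~ Uniform(a, b), so F(x) = (x - a)/(b - a) on [a, b]
% and 1 - F(x) = 0 for x > b, so the integral stops at b.
\[
  E(X) = a + \int_a^b \Bigl(1 - \frac{x - a}{b - a}\Bigr)\,dx
       = a + \frac{b - a}{2}
       = \frac{a + b}{2},
\]
% which matches the usual mean of the uniform distribution on [a, b].
```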

1) Is this formula true for any real number a? In particular, is it true for a < 0?

2) When is this formula ever useful (computationally)? Why not just get the density function and then integrate to find E(X)?
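Regarding question 1, here is a small numerical sketch (my own example, using a shifted exponential so the density and mean are known in closed form) that checks the formula for both negative and positive a. With X = a + Y and Y ~ Exp(1), we have F(x) = 1 - e^{-(x-a)} for x ≥ a and E(X) = a + 1, so the formula can be tested directly:

```python
import math

def expectation_via_survival(a, tail=40.0, n=40000):
    """Approximate a + integral over [a, a+tail] of [1 - F(x)] dx by the
    trapezoidal rule, for the shifted exponential X = a + Exp(1)."""
    # Survival function 1 - F(x) = exp(-(x - a)) for x >= a.
    S = lambda x: math.exp(-(x - a))
    upper = a + tail          # truncation point; the tail beyond is negligible
    h = (upper - a) / n       # step size of the trapezoidal grid
    integral = 0.5 * (S(a) + S(upper)) * h
    integral += h * sum(S(a + i * h) for i in range(1, n))
    return a + integral

# The exact mean is a + 1, including for negative a.
for a in (-3.0, 0.0, 2.5):
    approx = expectation_via_survival(a)
    print(a, approx)                       # close to a + 1 in each case
    assert abs(approx - (a + 1.0)) < 1e-3
```

This at least suggests numerically that the identity holds for a < 0 as well, as long as X is bounded below by a.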

Thanks for clarifying!

[also under discussion in talk stats forum]