# Math Help - Finding E(X) from distribution function

1. ## Finding E(X) from distribution function

Theorem: Let F(x) be the distribution function of X.
If X is any r.v. (discrete, continuous, or mixed) defined on the interval [a, ∞) (or some subset of it), then

E(X) = ∫_a^∞ [1 - F(x)] dx + a

1) Is this formula true for any real number a? In particular, is it true for a<0?
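As a quick numerical sanity check of the formula with a negative a (my own example, not from the thread): take X = Y - 1 with Y ~ Exponential(rate 2), so X is supported on [a, ∞) with a = -1 and the true mean is 1/2 - 1 = -1/2.

```python
import math

# Numerical check of E(X) = integral_a^inf [1 - F(x)] dx + a with a < 0.
# Hypothetical example: X = Y - 1 where Y ~ Exponential(rate=2), so X lives
# on [A, inf) with A = -1, and the true mean is 1/2 - 1 = -1/2.

A = -1.0

def survival(x):
    """1 - F(x) for X = Y - 1, Y ~ Exponential(2): exp(-2*(x - A)) for x >= A."""
    return math.exp(-2.0 * (x - A))

def expectation_from_cdf(a=A, upper=25.0, n=100_000):
    """Trapezoidal approximation of a + integral_a^upper [1 - F(x)] dx.

    The tail beyond `upper` is exponentially small, so truncating there
    costs essentially nothing.
    """
    h = (upper - a) / n
    total = 0.5 * (survival(a) + survival(upper))
    for i in range(1, n):
        total += survival(a + i * h)
    return a + h * total

print(expectation_from_cdf())  # ≈ -0.5, matching the true mean
```

The integral contributes 1/2 and the additive a = -1 brings the result to -1/2, so the formula does work here even though a < 0.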

2) When is this formula ever useful (computationally)? Why not just get the density function and then integrate to find E(X)?

Thanks for clarifying!

[also under discussion in talk stats forum]

2. Try to apply an argument similar to the one you've been shown here: http://www.mathhelpforum.com/math-he...pectation.html

As for when it is useful: it is a simple relationship between the cumulative distribution function and the expectation. If one wants to compute the expectation when only the cdf is available, you don't have to differentiate the cdf to get the pdf and then apply the usual formula.
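A sketch of that point with an example of my own (not from the thread): the censored variable X = min(T, 1) with T ~ Exponential(1) is a mixed r.v. with an atom at 1, so it has no density to integrate at all, yet the tail formula gives E(X) directly from the cdf.

```python
import math

# Hypothetical illustration: X = min(T, 1), T ~ Exponential(1), is a mixed
# r.v. (it puts probability exp(-1) on the single point 1), so no pdf exists.
# Its cdf is F(x) = 1 - exp(-x) for 0 <= x < 1 and F(x) = 1 for x >= 1,
# and the tail formula gives E(X) = integral_0^1 exp(-x) dx = 1 - exp(-1).

def cdf(x):
    """Cdf of min(T, 1) with T ~ Exponential(1)."""
    if x < 0:
        return 0.0
    if x >= 1.0:
        return 1.0
    return 1.0 - math.exp(-x)

def mean_from_cdf(n=100_000):
    """Midpoint-rule approximation of integral_0^1 [1 - F(x)] dx (here a = 0,
    and 1 - F(x) vanishes for x >= 1, so the integral stops at 1)."""
    h = 1.0 / n
    return sum((1.0 - cdf((i + 0.5) * h)) * h for i in range(n))

print(mean_from_cdf(), 1.0 - math.exp(-1))  # both ≈ 0.63212
```

The usual "find the pdf, then integrate x f(x)" recipe simply does not apply to this X, which is exactly the kind of case where the cdf-based formula earns its keep.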