Finding E(X) from distribution function

Theorem: Let F(x) be the distribution function of X.
If X is any r.v. (discrete, continuous, or mixed) defined on the interval $[a,\infty)$ (or some subset of it), then
$$E(X) = a + \int_a^\infty \bigl[1 - F(x)\bigr]\,dx.$$

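For concreteness, here is a quick numerical sanity check of the identity, a minimal sketch in Python (assuming NumPy and SciPy are available; the Exponential(1) test case is just an illustrative choice, not part of the theorem):

```python
import numpy as np
from scipy import integrate

# Check E(X) = a + integral_a^inf [1 - F(x)] dx
# for X ~ Exponential(1) on [0, inf): F(x) = 1 - exp(-x), so E(X) = 1.
a = 0.0

def F(x):
    return 1.0 - np.exp(-x)

tail, _ = integrate.quad(lambda x: 1.0 - F(x), a, np.inf)
print(a + tail)  # ~1.0, matching E(X) = 1
```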
1) Is this formula true for any real number a? In particular, is it true for a<0?
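One way to probe question 1 is to rerun the same check with a support that starts below zero; here is a sketch with X ~ Uniform(-1, 1), so a = -1 and E(X) = 0 (again an illustrative test case, not from the original post):

```python
import numpy as np
from scipy import integrate

# Same identity with a negative left endpoint:
# X ~ Uniform(-1, 1): a = -1, F(x) = (x + 1) / 2 on [-1, 1], E(X) = 0.
a = -1.0

def F(x):
    return np.clip((x + 1.0) / 2.0, 0.0, 1.0)

tail, _ = integrate.quad(lambda x: 1.0 - F(x), a, 1.0)  # integrand is 0 past x = 1
print(a + tail)  # ~0.0, matching E(X) = 0
```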

2) When is this formula ever useful (computationally)? Why not just get the density function and then integrate to find E(X)?
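One case where the tail formula can pay off computationally (a hypothetical illustration, not from the original post): when the survival function 1 - F(x) is easier to write down than the density, such as the minimum of n i.i.d. variables, where P(min > x) = [P(X_1 > x)]^n falls out directly with no differentiation needed. A sketch for the minimum of n i.i.d. Exponential(1) variables:

```python
import numpy as np
from scipy import integrate

# X = min of n i.i.d. Exponential(1) variables, supported on [0, inf).
# P(X > x) = P(all n exceed x) = exp(-n x), so
# E(X) = 0 + integral_0^inf exp(-n x) dx = 1 / n, no density required.
n = 5
tail, _ = integrate.quad(lambda x: np.exp(-n * x), 0.0, np.inf)
print(tail)  # ~0.2 = 1/n
```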

Thanks for clarifying!

[also under discussion in talk stats forum]
 
