- Dec 3rd 2011, 10:59 AM, Duke: variance of uniform distribution

Let b > a and let X ~ Uniform(a, b). Prove that Var(X) = (b - a)^2/12.

Var(X) = E(X^2) - E(X)^2

E(X^2) = integral from a to b of x^2/(b - a) dx

I know E(X) = (b - a)/2, so my E(X^2) must be wrong, as when I subtract I do not get the required Var(X).

- Dec 3rd 2011, 11:23 AM, Plato: Re: variance of uniform distribution

[reply not captured]
- Dec 3rd 2011, 12:07 PM, Duke: Re: variance of uniform distribution

But surely there is a missing factor of 1/(b - a) on the RHS.

- Dec 3rd 2011, 12:18 PM, Plato: Re: variance of uniform distribution

[reply not captured]

- Dec 3rd 2011, 12:21 PM, Duke: Re: variance of uniform distribution

So does it work out correctly for you?

- Dec 3rd 2011, 12:29 PM, Plato: Re: variance of uniform distribution

[reply not captured]

- Dec 3rd 2011, 12:39 PM, Duke: Re: variance of uniform distribution

So you are saying b^3 - a^3 = (b - a)(b + a)^2?
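The factorization guessed at here is not quite right: b^3 - a^3 = (b - a)(a^2 + ab + b^2), not (b - a)(b + a)^2. Carried through, and using the correct mean E(X) = (a + b)/2 rather than (b - a)/2 (which is likely the source of the mismatch above), the computation goes:

```latex
\begin{align*}
E(X^2) &= \int_a^b \frac{x^2}{b-a}\,dx
        = \frac{b^3 - a^3}{3(b-a)}
        = \frac{a^2 + ab + b^2}{3} \\
\operatorname{Var}(X) &= E(X^2) - E(X)^2
        = \frac{a^2 + ab + b^2}{3} - \left(\frac{a+b}{2}\right)^2 \\
       &= \frac{4(a^2 + ab + b^2) - 3(a+b)^2}{12}
        = \frac{a^2 - 2ab + b^2}{12}
        = \frac{(b-a)^2}{12}
\end{align*}
```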

- Dec 3rd 2011, 01:12 PM, Plato: Re: variance of uniform distribution

[reply not captured]

- Dec 3rd 2011, 04:18 PM, Plato: Re: variance of uniform distribution

[reply not captured]
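As a quick numerical sanity check of Var(X) = (b - a)^2/12, here is a minimal sketch (not from the thread; the helper name `uniform_moments` is my own) that evaluates the closed-form moments and compares:

```python
# Check Var(X) = (b - a)^2 / 12 for X ~ Uniform(a, b) using the
# closed-form moments of the uniform density f(x) = 1/(b - a) on [a, b].
def uniform_moments(a, b):
    """Return (E[X], E[X^2]) for X ~ Uniform(a, b)."""
    # E[X]   = integral from a to b of x / (b - a) dx   = (a + b) / 2
    # E[X^2] = integral from a to b of x^2 / (b - a) dx = (b^3 - a^3) / (3(b - a))
    return (a + b) / 2, (b**3 - a**3) / (3 * (b - a))

a, b = 2.0, 5.0
ex, ex2 = uniform_moments(a, b)
variance = ex2 - ex**2
print(variance)          # 0.75
print((b - a)**2 / 12)   # 0.75
```

For a = 2, b = 5 both expressions give 9/12 = 0.75, confirming the algebra above.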