
- Dec 3rd 2011, 09:59 AM, Duke: variance of uniform distribution

Let b > a and let X ~ Uniform(a, b). Prove that Var(X) = (b - a)^2 / 12.

Var(X) = E(X^2) - E(X)^2

= integral from a to b of [equation image not preserved in this printable view]

I know E(X) = (a + b)/2, so my E(X^2) must be wrong, as when I subtract I do not get the required Var(X).

- Dec 3rd 2011, 10:23 AM, Plato: Re: variance of uniform distribution

[reply was an equation image; not preserved in this printable view]
- Dec 3rd 2011, 11:07 AM, Duke: Re: variance of uniform distribution

But surely there is a missing factor of 1/(b - a) on the RHS?

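For reference, a sketch of the second moment Duke is after, using the standard Uniform(a, b) density f(x) = 1/(b - a) (the factor he mentions):

\[
\mathbb{E}(X^2) = \int_a^b x^2 \cdot \frac{1}{b-a}\,dx
= \frac{b^3 - a^3}{3(b-a)}
= \frac{a^2 + ab + b^2}{3}.
\]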
- Dec 3rd 2011, 11:18 AM, Plato: Re: variance of uniform distribution

[reply was an equation image; not preserved in this printable view]

- Dec 3rd 2011, 11:21 AM, Duke: Re: variance of uniform distribution
So does it work out correctly for you?

- Dec 3rd 2011, 11:29 AM, Plato: Re: variance of uniform distribution

[reply was an equation image; not preserved in this printable view]

- Dec 3rd 2011, 11:39 AM, Duke: Re: variance of uniform distribution
So you are saying b^3 - a^3 = (b - a)(b + a)^2?

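The factorization Duke quotes here is not the standard one; the identity that actually makes the moments collapse is (sketched for reference):

\[
b^3 - a^3 = (b-a)(b^2 + ab + a^2),
\]
so that
\[
\operatorname{Var}(X)
= \frac{a^2 + ab + b^2}{3} - \left(\frac{a+b}{2}\right)^2
= \frac{4(a^2 + ab + b^2) - 3(a+b)^2}{12}
= \frac{(b-a)^2}{12}.
\]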
- Dec 3rd 2011, 12:12 PM, Plato: Re: variance of uniform distribution

[reply was an equation image; not preserved in this printable view]

- Dec 3rd 2011, 03:18 PM, Plato: Re: variance of uniform distribution

[reply was an equation image; not preserved in this printable view]
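As a sanity check on the thread's result, here is a small sketch (function names are my own, not from the thread) that compares the closed form Var(X) = (b - a)^2 / 12 against the moment computation E(X^2) - E(X)^2 done with exact rational arithmetic:

```python
# Sanity check for Var(X) when X ~ Uniform(a, b), using exact fractions
# so no floating-point tolerance is needed.
from fractions import Fraction

def uniform_variance_closed_form(a, b):
    """The target identity: Var(X) = (b - a)^2 / 12."""
    return Fraction((b - a) ** 2, 12)

def uniform_variance_from_moments(a, b):
    """Var(X) = E(X^2) - E(X)^2 computed from the moments directly."""
    ex = Fraction(a + b, 2)                       # E(X)   = (a + b)/2
    ex2 = Fraction(b**3 - a**3, 3 * (b - a))      # E(X^2) = (b^3 - a^3)/(3(b - a)),
                                                  # note the 1/(b - a) density factor
    return ex2 - ex * ex

# The two computations agree for several (a, b) pairs with b > a:
for a, b in [(0, 1), (-2, 5), (3, 11)]:
    assert uniform_variance_closed_form(a, b) == uniform_variance_from_moments(a, b)

print(uniform_variance_closed_form(0, 1))  # 1/12
```

Using `Fraction` keeps every step exact, so the equality check is a genuine identity test rather than an approximate comparison.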