Note: This forms part of an assignment I'm on the hook for, so I'm really looking for pointers on where I'm going wrong rather than a "here's your answer" reply.

The Problem: we have a random variable with a known probability density function:

f(x) = c*x^2, for (0 <= x <= 1), f(x) is 0 otherwise.

I can find c (from the requirement that f integrates to 1), and then determine the expected value E(X) of the r.v. directly from the definition of the first moment (i.e. integrating x*f(x) over [0, 1]). *BUT* to verify my result, I'm trying to determine the Moment Generating Function (MGF) for this distribution and then extract the first moment from it. I'm expecting to get the same E(X) as above, but instead I'm getting a mess.
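(Not part of the assignment itself, just how I'm checking myself: a quick numerical sketch of the direct-moment route. The `integrate` helper is a crude midpoint rule I wrote for this, not a library call.)

```python
# Sanity check of the direct approach: normalise f(x) = c*x^2 on [0, 1],
# then compute E(X) as the integral of x*f(x) over the same interval.
def integrate(g, a, b, n=100_000):
    """Crude midpoint-rule quadrature of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# c is whatever makes the total probability come out to 1
c = 1.0 / integrate(lambda x: x**2, 0.0, 1.0)

# first moment straight from the definition, E(X) = integral of x * c*x^2
mean = integrate(lambda x: x * c * x**2, 0.0, 1.0)

print(c, mean)
```

This is the value of E(X) I'm then trying to reproduce via the MGF.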

I'm doing this by hand at the moment so it could be I've just borked my arithmetic.

Questions:

1) I'm assuming that, in this particular case, the MGF exists and that differentiating it (at t = 0) will hand me the moments I need. Is that sane?

2) The MGF I get (by repeated integration by parts, done by hand on the back of a piece of paper) is:

M_X(t) = 6*(e^t - 1)/t^3 - 6*e^t/t^2 + 3*e^t/t
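To see whether my by-parts arithmetic at least matches the definition M_X(t) = E[e^(tX)], I compared my closed form against a direct numerical integration of 3*x^2*e^(tx) at a few nonzero t values (again just my own checking sketch; `integrate` is the same homemade midpoint rule, and the constant 3 is the c I found in the first part):

```python
import math

def integrate(g, a, b, n=100_000):
    """Crude midpoint-rule quadrature of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

def mgf_claimed(t):
    # my integration-by-parts result (only makes sense for t != 0)
    return 6*(math.exp(t) - 1)/t**3 - 6*math.exp(t)/t**2 + 3*math.exp(t)/t

def mgf_numeric(t):
    # straight from the definition: M(t) = integral_0^1 3*x^2*e^(t*x) dx
    return integrate(lambda x: 3*x**2*math.exp(t*x), 0.0, 1.0)

for t in (0.5, 1.0, 2.0):
    print(t, mgf_claimed(t), mgf_numeric(t))
```

If the two columns agree away from t = 0, the by-parts work itself is fine and my trouble is specifically with what happens at t = 0.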

I suspect my result in (2) is incorrect, or I'm completely missing something, since the first moment seems unobtainable from it: naively evaluating dM/dt at t = 0 blows up (every term has a power of t in the denominator), so I can't verify that the mean matches the value I calculated originally above.
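One more check I tried, since I can't plug t = 0 in directly: estimating the slope of my claimed MGF *near* t = 0 with a central difference that straddles the hole, to see whether dM/dt actually diverges there or is just giving me a 0/0 I don't know how to handle (purely my own numerical experiment, not from the assignment):

```python
import math

def mgf_claimed(t):
    # my integration-by-parts result (only makes sense for t != 0)
    return 6*(math.exp(t) - 1)/t**3 - 6*math.exp(t)/t**2 + 3*math.exp(t)/t

# central-difference estimate of dM/dt at t = 0, never evaluating at t = 0 itself
h = 0.01
slope_near_zero = (mgf_claimed(h) - mgf_claimed(-h)) / (2*h)

print(slope_near_zero)  # if my MGF is right, this should approach my E(X)
```

If this converges to the E(X) I got from the direct moment calculation, I'd guess the t = 0 problem is something about how I'm taking the limit rather than the MGF itself, but I'd appreciate confirmation.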

Expert help gratefully appreciated