I am having a tough time with these. For instance, take the following distribution: f(x) = 1 for 0 < x < 1, and f(x) = 0 elsewhere. I was under the impression that I had to integrate e^(tx) f(x) dx over all values of x, which here means integrating e^(tx) from 0 to 1 and yields (e^t - 1) / t. But the back of the book says that the moment generating function is 2e^t / (3 - e^t). I have no clue how they got it. Is it simply a matter of integrating x from negative to positive infinity, instead of from 0 to 1? Or is there more to it than that?
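In case it helps anyone checking this, here is a quick numeric sanity check of the integral for the density above, assuming f(x) = 1 on (0, 1). The helper name `mgf_uniform01` is just my own label; it approximates M(t) = ∫₀¹ e^(tx) dx with a midpoint rule and compares it with the closed form (e^t - 1)/t.

```python
import math

def mgf_uniform01(t, n=100_000):
    """Approximate M(t) = integral of e^(t*x) * f(x) dx over (0, 1),
    where f(x) = 1 on that interval (midpoint rule, n subintervals)."""
    h = 1.0 / n
    return sum(math.exp(t * (i + 0.5) * h) for i in range(n)) * h

t = 1.0
numeric = mgf_uniform01(t)
closed_form = (math.exp(t) - 1) / t  # antiderivative e^(t*x)/t evaluated 0 to 1
print(numeric, closed_form)
```

At t = 1 both values agree to many decimal places, so the integration from 0 to 1 does give (e^t - 1)/t rather than the book's 2e^t / (3 - e^t).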