Moment generating function for random sums - question about proof

Suppose that $\displaystyle \{X_j\}$ is a sequence of independent, identically distributed random variables and that $\displaystyle N$ is a random variable taking non-negative integer values.

If $\displaystyle Y=\sum_{j=1}^{N}X_j$, then $\displaystyle M_Y(t)=M_N(\ln M_X(t))$.

Proof.

$\displaystyle M_Y(t)=E[M_{Y|N}(t|N)]=E[(M_X(t))^N]=E[e^{NXt}]=E[\exp(N\ln(e^{Xt}))]=$

$\displaystyle =E(e^{N\ln M_X(t)})=M_N(\ln M_X(t))$

The beginning and the end are from the textbook; the part in the middle, $\displaystyle E[e^{NXt}]=E[\exp(N\ln(e^{Xt}))]$, is mine.

I am not quite sure about the last transition

$\displaystyle E[\exp(N\ln(e^{Xt}))]=E(e^{N\ln M_X(t)})=M_N(\ln M_X(t))$.

I can see how $\displaystyle E[e^{N\cdot \text{something}}]$ becomes $\displaystyle M_N(\text{something})$. What I am not sure about is why $\displaystyle \ln(e^{Xt})=\ln M_X(t)$ when there is no expected value $E$ anywhere around it (since the definition of the MGF is $\displaystyle M_X(t)=E(e^{Xt})$).
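As a sanity check on the identity $M_Y(t)=M_N(\ln M_X(t))$ itself (separate from the question about the proof step), here is a quick Monte Carlo comparison. The choice of $N \sim \text{Poisson}(\lambda)$ and $X_j \sim \text{Exp}(1)$ is my own example, not from the textbook; for these, $M_X(t)=1/(1-t)$ for $t<1$ and $M_N(s)=\exp(\lambda(e^s-1))$:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, n_sim = 2.0, 0.3, 50_000

# Simulate Y = sum_{j=1}^N X_j with N ~ Poisson(lam) and X_j ~ Exp(1).
N = rng.poisson(lam, n_sim)
Y = np.array([rng.exponential(1.0, n).sum() for n in N])

# Monte Carlo estimate of M_Y(t) = E[e^{tY}].
mc = np.exp(t * Y).mean()

# Closed form: M_N(ln M_X(t)) with M_X(t) = 1/(1-t) for Exp(1), t < 1,
# and M_N(s) = exp(lam * (e^s - 1)) for Poisson(lam), so e^s = M_X(t).
mx = 1.0 / (1.0 - t)
closed = np.exp(lam * (mx - 1.0))

print(mc, closed)  # the two values should agree closely
```

Running this, the simulated $M_Y(t)$ and the closed-form $M_N(\ln M_X(t))$ agree to within Monte Carlo error, which at least confirms the theorem's statement numerically.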