I have two questions:

1) Let X_1, X_2, ..., X_n be i.i.d. from a distribution with pdf:

I am supposed to find the MLE (maximum likelihood estimator) for theta. However, if I take the derivative of the likelihood function (likelihood function given below) and set it equal to zero, it doesn't yield a solution.

So I must maximize the likelihood function a different way instead. I argue that in order for the likelihood function to be maximized,

(which is the first order statistic; thus "theta hat" equals the sample minimum).

Is my logic correct?
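Since the pdf didn't come through above, here is a numerical sanity check of my argument under an assumed pdf: I use the shifted exponential f(x; theta) = exp(-(x - theta)) for x >= theta (my stand-in example, not necessarily the actual pdf from the problem), where the likelihood is increasing in theta right up to min(x_i) and zero beyond it:

```python
import numpy as np

# Hypothetical stand-in pdf (the actual pdf was omitted above):
# shifted exponential f(x; theta) = exp(-(x - theta)) for x >= theta.
rng = np.random.default_rng(0)
theta_true = 2.0
x = theta_true + rng.exponential(size=50)

def log_likelihood(theta, x):
    # The likelihood is zero (log-likelihood -inf) unless theta <= min(x_i),
    # because every observation must satisfy x_i >= theta.
    if theta > x.min():
        return -np.inf
    return np.sum(-(x - theta))  # sum of log f(x_i; theta)

# On (-inf, min(x)] the log-likelihood is strictly increasing in theta,
# so it is maximized at the right endpoint: theta_hat = min(x).
grid = np.linspace(0.0, x.min(), 200)
vals = [log_likelihood(t, x) for t in grid]
print("argmax on grid:", grid[int(np.argmax(vals))], " min(x):", x.min())
```

Under this assumed pdf, the grid maximum lands at min(x), matching the first-order-statistic argument; the same "derivative never hits zero, so check the boundary" reasoning applies whenever the support depends on theta.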

2) Let X_1, X_2, ..., X_n be i.i.d. with distribution Laplace(u,1). Find the Fisher information for u.

Here's the pdf: f(x; u) = (1/2) exp(-|x - u|).

When I compute the Fisher information I_n(u), I get zero... is this right? If not, what is another way to compute it?
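As a sanity check on my answer, here is a quick Monte Carlo estimate using the score-squared form of the definition, I(u) = E[(d/du log f(X; u))^2], rather than the second-derivative form (for this pdf, d/du log f(x; u) = sign(x - u), whose derivative in u is zero almost everywhere, which may be where my zero came from):

```python
import numpy as np

# Monte Carlo check of the per-observation Fisher information for
# Laplace(u, 1), using I(u) = E[(d/du log f(X; u))^2].
# For f(x; u) = (1/2) exp(-|x - u|), the score is sign(x - u).
rng = np.random.default_rng(0)
u = 0.0
x = rng.laplace(loc=u, scale=1.0, size=100_000)
score = np.sign(x - u)          # d/du log f(x; u)
print(np.mean(score**2))        # estimate of I(u); ~1.0 here
```

The score squared is 1 almost surely, so this estimate comes out at 1 per observation, which would make I_n(u) = n rather than 0. I'd appreciate confirmation that using the score-squared form is the right fix.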

Thanks for the help!