# Math Help - Advanced Statistics Questions

I have two questions:

1) Let X_1, X_2, ..., X_n be i.i.d. from a distribution with pdf:

$f(x;\theta) = \frac{1}{\theta}\, I(0 \le x \le \theta)$
I am supposed to find the MLE (maximum likelihood estimator) for theta. However, if I take the derivative of the likelihood function (given below) and set it equal to zero, it doesn't work.

I must maximize the likelihood function a different way, then. I argue that in order for the likelihood function to be maximized, theta must be as small as possible, so $\hat{\theta} = X_{(1)}$ (which is the first order statistic; thus $\hat{\theta}$ equals the minimum).

Is my logic correct?

2) Let X_1, X_2, ..., X_n be i.i.d. with distribution Laplace(u,1). Find the Fisher information for u.

Here's the pdf:

$f(x;u) = \frac{1}{2} e^{-|x-u|}$
When I compute the Fisher information, $I_n(u)$, I get zero... is this right? If not, what is another way to solve it?
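A quick numerical sanity check (a sketch, not from the original thread, assuming the standard Laplace(u,1) pdf $f(x;u) = \frac{1}{2} e^{-|x-u|}$): the score is $\frac{\partial}{\partial u}\log f(x;u) = \mathrm{sign}(x-u)$ almost everywhere, so its square is 1 and the per-observation information is 1, giving $I_n(u) = n$ rather than 0. All variable names below are mine:

```python
import numpy as np

# Monte Carlo check of the per-observation Fisher information for
# Laplace(u, 1), assuming pdf f(x; u) = (1/2) * exp(-|x - u|).
# Score: d/du log f(x; u) = sign(x - u) (almost everywhere), so
# score^2 = 1 almost surely and E[score^2] = 1.

rng = np.random.default_rng(0)
u = 2.0
x = rng.laplace(loc=u, scale=1.0, size=200_000)

score = np.sign(x - u)            # per-observation score at the true u
info_per_obs = np.mean(score**2)  # Monte Carlo estimate of E[score^2]

print(info_per_obs)               # -> 1.0, so I_n(u) = n * 1 = n
```

Getting zero usually means the second derivative of $\log f$ was taken literally: it vanishes away from $x = u$, so the information has to be computed as $E[(\text{score})^2]$ instead.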

Thanks for the help!

2. Nope, the joint density is largest when theta is as small as possible.
You have $I(0\le X_{(n)}\le \theta)$.
Hence the LARGEST order statistic, not the smallest, is the MLE.
It's sufficient too.
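The correction above can be checked numerically: for Uniform(0, θ) the likelihood is $L(\theta) = \theta^{-n}\, I(X_{(n)} \le \theta)$, which is decreasing in θ but zero below $X_{(n)}$, so its maximizer sits exactly at the largest order statistic. A small sketch (parameter values are mine):

```python
import numpy as np

# Check that the Uniform(0, theta) likelihood
#   L(theta) = theta^(-n) * I(X_(n) <= theta)
# is maximized at the LARGEST order statistic.

rng = np.random.default_rng(1)
true_theta = 5.0
x = rng.uniform(0.0, true_theta, size=50)
n = len(x)

def likelihood(theta):
    # zero unless every observation fits below theta
    return theta**(-n) if x.max() <= theta else 0.0

grid = np.linspace(0.01, 10.0, 100_000)
vals = np.array([likelihood(t) for t in grid])
mle_on_grid = grid[vals.argmax()]

print(x.max(), mle_on_grid)  # the grid maximizer sits right at X_(n)
```

Any θ below $X_{(n)}$ gives likelihood zero, and any θ above it only shrinks $\theta^{-n}$, which is why calculus (setting the derivative to zero) never finds this boundary maximum.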

3. Thanks, matheagle! Since I got that one wrong, please check my work on the following (which I thought I did correctly):
Let X_1, X_2, ... , X_n be i.i.d. from a distribution with a pdf:


Find the MLE for sigma. Again, as in the previous problem, the derivative of the likelihood function won't help me, so I look at the likelihood function:

I argue that the likelihood function is maximized when sigma is minimized. However, sigma can only be as small as the largest order statistic because of the bounds of X. So, $\hat{\sigma} = X_{(n)}$.

Am I right? I am a little hesitant about my answer because mu also appears in the bounds of X, so I don't know if I have to include mu in the estimator somehow as well.

Thanks again!

4. Sorry. I'm no longer answering questions here.