Maximum Likelihood Estimator

Let X_{i}, i=1,2,...,n be independent random variables with probability density functions f_{i}(x_{i};theta) = (2/(i*theta))*(x_{i}/(i*theta)) = 2x_{i}/(i*theta)^2, for 0 < x_{i} < i*theta and zero otherwise, where theta > 0.

I need to show that T_{n}=max{X_{1}/1, X_{2}/2,...,X_{n}/n} is the maximum likelihood estimator of theta.

Can somebody give me some help on how I would go about showing this?

Thanks

Re: Maximum Likelihood Estimator

Hey Mick.

What is the likelihood of the distribution: (Hint: it will look like an order statistic)?

Re: Maximum Likelihood Estimator

The likelihood is the product from i=1 to n of (2/(i*theta))*(X_{i}/(i*theta)).

Which gives something like (2/theta)*(X_{1}/theta) * (2/(2theta))*(X_{2}/(2theta)) * ... * (2/(n*theta))*(X_{n}/(n*theta))

Is this right?

Re: Maximum Likelihood Estimator

Yes, that's correct. Now take the logarithm of that likelihood, differentiate the log-likelihood (the log of the likelihood) with respect to theta, and find the maximum of this function.

The reason we use the logarithmic version is that it is monotonically increasing, so it gives the same answer as maximising the likelihood in its product form. And because it turns products into sums, it makes the log-likelihood much easier to deal with, differentiate, and solve.
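To make the monotonicity point concrete, here is a minimal numerical sketch (assuming Python with NumPy and a made-up sample; none of the numbers come from the thread): it evaluates the likelihood and the log-likelihood of this model on a grid of theta values and checks that both peak at the same theta.

```python
import numpy as np

# Sketch with made-up data: likelihood and log-likelihood of the model
# f_i(x; theta) = 2x/(i*theta)^2 on (0, i*theta), evaluated on a theta grid.
n = 10
i = np.arange(1, n + 1)
x = 0.8 * i                          # arbitrary sample with 0 < x_i < i*theta

# Only theta > max(x_i/i) gives every observation a positive density.
grid = np.linspace(np.max(x / i) + 0.01, 4.0, 500)

lik = np.array([np.prod(2 * x / (i * th) ** 2) for th in grid])
loglik = np.array([np.sum(np.log(2 * x) - 2 * np.log(i * th)) for th in grid])

# log is strictly increasing, so the maximiser is identical.
assert np.argmax(lik) == np.argmax(loglik)
```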

Re: Maximum Likelihood Estimator

So the log-likelihood is the log of the above, which using the laws of logs can be expressed as:

log(2X_{1}) + log(2X_{2}) + ... + log(2X_{n}) - log(theta^2) - log(4theta^2) - ... - log(n^2*theta^2), which can be expressed using summations as:

sum (i=1 to n) log(2X_{i}) - sum (i=1 to n) 2log(i*theta)

When differentiating with respect to theta, the first summation disappears as it is constant with respect to theta, so I'm differentiating the sum (i=1 to n) 2log(i*theta). Multiplied by (-1) for the negative of course.

Since d/dtheta of 2log(i*theta) = 2/theta for each i, I get dl/dtheta = (-1) * sum (i=1 to n) 2/theta = -2n/theta.

Which surely can't be right, as this is strictly negative for every theta > 0 and can never equal zero?
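As a sanity check on the differentiation step, here is a small sketch (Python with NumPy, arbitrary made-up sample) comparing a numerical derivative of the log-likelihood with -2n/theta, since each term 2log(i*theta) differentiates to 2/theta.

```python
import numpy as np

# Sketch with made-up data: central-difference derivative of the
# log-likelihood, compared against the closed form -2n/theta.
n = 6
i = np.arange(1, n + 1)
x = 0.5 * i                       # arbitrary sample with 0 < x_i < i*theta

def loglik(th):
    return np.sum(np.log(2 * x) - 2 * np.log(i * th))

th = 1.2
h = 1e-6
numeric = (loglik(th + h) - loglik(th - h)) / (2 * h)
assert abs(numeric - (-2 * n / th)) < 1e-4   # dl/dtheta = -2n/theta = -10
```

The derivative is negative for every theta > 0, so there is no interior stationary point.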

Re: Maximum Likelihood Estimator

Actually, although I did hint at this with the order statistic comment, I should have been more specific: you need to use the order statistic to estimate this parameter, since the support of the x_i's depends on theta. You can't find the MLE by setting the derivative to zero in this case.

Have you looked at order statistics and order distributions?

Re: Maximum Likelihood Estimator

Can you expand a little please?

Does this mean that the likelihood I derived in post three is not correct? I've done a bit on order statistics, but not regarding MLEs.

Re: Maximum Likelihood Estimator

Yes, it means that you can't maximise the likelihood you posted by differentiation, since the range of the x_i's is linked to the parameter of interest.

Instead you need to find a likelihood that is based on the order statistic distribution and then use that log-likelihood to get the estimate of the parameter.

Intuitively you know that x_i < i*theta, so theta > x_i/i for every i. This bounds theta from below, and since the likelihood is decreasing in theta, the estimate should be the maximum of all values of x_i/i.

If you need an example, take a look at the derivation of the MLE for a continuous uniform distribution using the order statistic distribution: for a uniform U(0,a) with a > 0, the MLE of a is the maximum of all the observed values of x.
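Here is a small simulation sketch of the boundary estimate (Python with NumPy; theta, n, and the seed are made up). Since F_i(x) = (x/(i*theta))^2 on (0, i*theta), you can sample by inverse transform as X_i = i*theta*sqrt(U), and then check that T_n = max(X_i/i) stays below theta but gets close to it for large n.

```python
import numpy as np

# Sketch with made-up numbers: inverse-transform sampling from
# f_i(x) = 2x/(i*theta)^2, then the boundary estimate T_n = max(X_i/i).
rng = np.random.default_rng(0)
theta = 3.0
n = 5000
i = np.arange(1, n + 1)

u = rng.uniform(size=n)
x = i * theta * np.sqrt(u)        # X_i = i*theta*sqrt(U), since F_i(x) = (x/(i*theta))^2

t_n = np.max(x / i)               # the candidate MLE

# Each X_i < i*theta, so T_n can never exceed theta; for large n it is close.
assert t_n < theta
assert theta - t_n < 0.05
```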

Re: Maximum Likelihood Estimator

I still don't understand how I go about finding the likelihood function. What would the first steps be?

Re: Maximum Likelihood Estimator

Because the support depends on the parameter, you are looking for the distribution of max[X_i/i]. You can write this formally using indicator functions, where you pick out the observation for which X_i/i is largest.

If you have a textbook example of the MLE of a for a uniform U(0,a) with a > 0, then it is very similar to that, and I'll leave you with the details unless you have another specific question.
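For the order statistic route, the CDF of T_n follows from independence: P(T_n <= t) = product over i of P(X_i <= i*t) = product over i of (t/theta)^2 = (t/theta)^(2n) for 0 < t < theta. A quick Monte Carlo sketch (Python with NumPy; all numbers made up) checks this formula:

```python
import numpy as np

# Sketch with made-up numbers: empirical CDF of T_n = max(X_i/i) versus
# the closed form P(T_n <= t) = (t/theta)^(2n).
rng = np.random.default_rng(2)
theta, n, reps = 2.0, 4, 200_000
i = np.arange(1, n + 1)

u = rng.uniform(size=(reps, n))
x = i * theta * np.sqrt(u)            # X_i = i*theta*sqrt(U)
t_n = np.max(x / i, axis=1)

t = 1.5
empirical = np.mean(t_n <= t)
exact = (t / theta) ** (2 * n)        # (0.75)^8, about 0.10
assert abs(empirical - exact) < 0.01
```

Maximising the corresponding likelihood of T_n over theta > T_n again pushes theta down to the boundary, giving T_n as the MLE.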