Maximum Likelihood Estimator

Mick
Let Xi, i=1,2,...,n, be independent random variables with probability density functions fi(xi; theta) = (2/(i*theta))*(xi/(i*theta)) for 0 < xi < i*theta and zero otherwise, where theta > 0.

I need to show that Tn=max{X1/1, X2/2,...,Xn/n} is the maximum likelihood estimator of theta.
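
For concreteness, here is a quick simulation sketch of the setup (theta = 2 and n = 50 are made-up illustration values; the sampler uses the fact that the CDF works out to F_i(x) = (x/(i*theta))^2 on (0, i*theta)):

```python
import numpy as np

# Hypothetical illustration values; theta is the unknown parameter in the problem.
rng = np.random.default_rng(0)
theta, n = 2.0, 50
i = np.arange(1, n + 1)

# Inverse-transform sampling: F_i(x) = (x/(i*theta))^2, so X_i = i*theta*sqrt(U).
x = i * theta * np.sqrt(rng.uniform(size=n))

# The proposed estimator T_n = max(X_i / i) sits just below theta
# (by construction it can never exceed theta).
print(np.max(x / i))
```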

Can somebody give me some help on how I would go about showing this?

Thanks
 

chiro

MHF Helper
Hey Mick.

What is the likelihood of the distribution? (Hint: it will look like an order statistic.)
 
Mick
The likelihood is the product from i=1 to n of (2/(i*theta))*(Xi/(i*theta)).

Which gives something like (2/theta)*(X1/theta) * (2/(2*theta))*(X2/(2*theta)) * ... * (2/(n*theta))*(Xn/(n*theta))

Is this right?
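
As a sanity check, here is the product written out in code (made-up x values; each factor simplifies to 2*Xi/(i*theta)^2):

```python
import numpy as np

# Made-up observations, just to evaluate the product form of the likelihood.
x = np.array([0.7, 1.9, 2.2, 3.5])      # hypothetical data, n = 4
i = np.arange(1, len(x) + 1)

def likelihood(theta):
    # Each factor is (2/(i*theta)) * (x_i/(i*theta)) = 2*x_i/(i*theta)**2,
    # but the density is zero unless every x_i < i*theta.
    if np.any(x >= i * theta):
        return 0.0
    return np.prod(2 * x / (i * theta) ** 2)

print(likelihood(2.0))   # positive here, since every x_i < 2*i
```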
 

chiro

MHF Helper
Yes, that's correct: now take the logarithm of that likelihood, differentiate the log-likelihood function (the log of the likelihood), and find the maximum of this function.

The reason we use the logarithmic version is that it is monotonic increasing, so it gives the same maximizer as the likelihood itself, and it turns products into sums, which makes the log-likelihood easier to deal with, differentiate, and solve.
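
Here is a minimal numeric sketch of that point, with simulated data as in the earlier snippet: the likelihood and the log-likelihood peak at the same theta. (The grid starts just above max(x_i/i), because the density of each x_i is zero unless x_i < i*theta.)

```python
import numpy as np

# Simulated data as in the earlier sketch (true theta = 2.0 is hypothetical).
rng = np.random.default_rng(1)
n = 10
i = np.arange(1, n + 1)
x = i * 2.0 * np.sqrt(rng.uniform(size=n))

# Candidate thetas; the likelihood is zero once any x_i >= i*theta,
# i.e. for theta below max(x_i/i).
thetas = np.linspace(np.max(x / i) + 1e-9, 5.0, 200)
lik = np.array([np.prod(2 * x / (i * t) ** 2) for t in thetas])
loglik = np.array([np.sum(np.log(2 * x) - 2 * np.log(i * t)) for t in thetas])

# log is strictly increasing, so both curves peak at the same theta.
assert thetas[np.argmax(lik)] == thetas[np.argmax(loglik)]
```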
 
Mick
So the log-likelihood is the log of the above, which using laws of logs can be expressed as:

log(2X1) + log(2X2) + ... + log(2Xn) - log(theta^2) - log(4*theta^2) - ... - log(n^2*theta^2), which can be expressed using summations as:

sum (i=1 to n) log(2Xi) - sum (i=1 to n) 2*log(i*theta)

When differentiating with respect to theta, the first summation term disappears as it is constant with respect to theta, so I'm differentiating the sum (i=1 to n) 2*log(i*theta), multiplied by (-1) for the negative of course.

Each term differentiates to 2/theta, so dl/dtheta = -2/theta - 2/theta - ... - 2/theta = -2n/theta.

Which surely can't be right, as this is strictly negative and can never equal zero?
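
A quick symbolic check of that term-by-term derivative (this assumes sympy is available):

```python
import sympy as sp

theta, i = sp.symbols('theta i', positive=True)

# The only theta-dependent piece of the i-th log-likelihood term is -2*log(i*theta).
term = -2 * sp.log(i * theta)

# Each term contributes -2/theta regardless of i, so summing over
# i = 1..n gives dl/dtheta = -2*n/theta.
print(sp.diff(term, theta))   # -> -2/theta
```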
 

chiro

MHF Helper
Actually, although I did point this out with the order statistic comment, I should have been more specific: you need to use the order statistic to estimate this parameter, since the support of the x's depends on theta. You can't get the MLE by setting the derivative to zero in this case.

Have you looked at order statistics and order distributions?
 
Mick
Can you expand a little please?

Does this mean that the likelihood I derived in post three is not correct? I've done a bit on order statistics, but not regarding MLEs.
 

chiro

MHF Helper
Yes, it means that you can't maximize the likelihood you posted by differentiation, since the xi's are linked to the parameter of interest through the support of the density.

Instead you need to find a likelihood that is based on the order statistic distribution and then use that log-likelihood to get the estimate of the parameter.

Intuitively, you know that x_i < i*theta, i.e. x_i/i < theta for every i, so each x_i/i is a lower bound for theta, and the estimate should be the maximum of all values of x_i/i.

If you need an example, take a look at the derivation of the MLE for a continuous uniform distribution using the order statistic distribution to get the parameter (which will be the maximum of all values of x for a uniform U(0,a) where a > 0).
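
To see why boundary maximization works in that example, here is a sketch (hypothetical sample; the grid search just makes the boundary maximizer visible):

```python
import numpy as np

# U(0, a) analogy: the likelihood is a**(-n) for a >= max(x) and zero
# otherwise, so it is strictly decreasing on the feasible region and
# is maximized at the boundary, a_hat = max(x).
rng = np.random.default_rng(2)
x = rng.uniform(0.0, 3.0, size=25)      # hypothetical sample, true a = 3.0

a_grid = np.linspace(0.1, 5.0, 500)
lik = np.where(a_grid >= x.max(), a_grid ** (-len(x)), 0.0)

# The grid argmax lands on the first grid point at or above max(x).
print(a_grid[np.argmax(lik)], x.max())
```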
 
Mick
I still don't understand how I go about finding the likelihood function. What would the first steps be?
 

chiro

MHF Helper
Because of the dependency between the state space and the parameter, finding the MLE means looking at the distribution of Max[X_i/i]. You can write this formally using indicator functions, where you pick out a particular observation when its X_i/i is the biggest.

If you have a textbook example of the ML estimator for the upper endpoint of a U(0,a) distribution, a > 0, then this problem is very similar, and I'll leave you with the details unless you have another specific question.
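
Pulling the thread together with the same kind of hypothetical setup as the earlier sketches: on the feasible region theta >= max(x_i/i), the likelihood is proportional to 1/theta^(2n), which is strictly decreasing, so it is maximized at the left endpoint T_n = max(X_i/i):

```python
import numpy as np

# Same hypothetical setup as before.
rng = np.random.default_rng(3)
theta, n = 2.0, 40
i = np.arange(1, n + 1)
x = i * theta * np.sqrt(rng.uniform(size=n))

t_n = np.max(x / i)

# On theta >= t_n the likelihood is proportional to theta**(-2n), strictly
# decreasing, so the log-likelihood is largest at the left end of the grid.
grid = np.linspace(t_n, t_n + 2.0, 300)
loglik = np.array([np.sum(np.log(2 * x) - 2 * np.log(i * t)) for t in grid])
print(grid[np.argmax(loglik)], t_n)     # both equal t_n
```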