# maximum likelihood estimate

• May 29th 2009, 06:57 AM
mahefo
maximum likelihood estimate
I have a question about the maximum likelihood estimator: $\displaystyle \hat{\theta}$ is called the maximum likelihood estimate of $\displaystyle \theta$ if it is the value of $\displaystyle \theta$ that maximizes $\displaystyle f(x_1,x_2,...,x_n|\theta)$. Why do we define it like this? Why not the minimum? (Here $\displaystyle f(x_1,x_2,...,x_n|\theta)$ denotes the joint probability mass function of the random variables $\displaystyle X_1,..., X_n$.)
Can you explain for me?
Thank you very much.
• May 29th 2009, 09:49 AM
HallsofIvy
Quote:

Originally Posted by mahefo
I have a question about the maximum likelihood estimator: $\displaystyle \hat{\theta}$ is called the maximum likelihood estimate of $\displaystyle \theta$ if it is the value of $\displaystyle \theta$ that maximizes $\displaystyle f(x_1,x_2,...,x_n|\theta)$. Why do we define it like this? Why not the minimum? (Here $\displaystyle f(x_1,x_2,...,x_n|\theta)$ denotes the joint probability mass function of the random variables $\displaystyle X_1,..., X_n$.)
Can you explain for me?
Thank you very much.

Why would you want to minimize the probability? That would be saying that you estimate the parameter to be the value that makes the sample you got *least* likely! While it doesn't always happen, you would expect most often to get the result that is most likely, not least likely. So it makes far more sense to pick the parameter value under which your observed result was the most likely one.
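To make this concrete, here is a minimal numerical sketch (not from the thread; the sample of 7 heads in 10 coin flips is a made-up example). For a Bernoulli sample with $\displaystyle k$ successes in $\displaystyle n$ trials, the likelihood $\displaystyle L(p)=p^k(1-p)^{n-k}$ is largest at $\displaystyle \hat{p}=k/n$, the sample proportion, i.e. at the value that makes the observed data most probable:

```python
import numpy as np

# Hypothetical sample: 7 heads in 10 coin flips
n, k = 10, 7

# Evaluate the likelihood L(p) = p^k (1-p)^(n-k) on a grid of p values
p_grid = np.linspace(0.001, 0.999, 999)
likelihood = p_grid**k * (1 - p_grid)**(n - k)

# The maximizer of the likelihood is the sample proportion k/n
p_hat = p_grid[np.argmax(likelihood)]
print(p_hat)  # close to k/n = 0.7
```

Minimizing the same function instead would push $\displaystyle \hat{p}$ toward 0 or 1, values under which seeing 7 heads in 10 flips is nearly impossible, which is exactly the backwards estimate described above.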