1. ## Maximum likelihood estimate

I have a question. In maximum likelihood estimation, $\hat{\theta}$ is called the maximum likelihood estimator of $\theta$ if $\hat{\theta}$ is the value that maximizes $f(x_1,x_2,...,x_n|\theta)$. Why is it defined this way? Why not the minimum? (Here $f(x_1,x_2,...,x_n|\theta)$ denotes the joint probability mass function of the random variables $X_1,...,X_n$.)
Can you explain this to me?
Thank you very much.

2. Originally Posted by mahefo
I have a question. In maximum likelihood estimation, $\hat{\theta}$ is called the maximum likelihood estimator of $\theta$ if $\hat{\theta}$ is the value that maximizes $f(x_1,x_2,...,x_n|\theta)$. Why is it defined this way? Why not the minimum? (Here $f(x_1,x_2,...,x_n|\theta)$ denotes the joint probability mass function of the random variables $X_1,...,X_n$.)
Can you explain this to me?
Thank you very much.
Why would you want to minimize the probability? That would mean estimating the parameter to be the value that makes the sample you got *least* likely! While it doesn't always happen, you would expect the result you observed to be the most likely one, not the least likely. So it makes far more sense to choose the parameter value under which your observed sample was most probable.
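To make this concrete, here is a small numerical sketch (the sample and the function names are just mine for illustration). For i.i.d. Bernoulli($p$) data, $f(x_1,...,x_n|p) = \prod_i p^{x_i}(1-p)^{1-x_i}$, and scanning a grid of candidate $p$ values shows the maximizer sits near the sample mean, while the minimizer is a boundary value under which the observed data would be nearly impossible:

```python
import math

# Hypothetical sample of coin flips (1 = heads): 7 heads in 10 tosses.
sample = [1, 1, 0, 1, 1, 1, 0, 1, 0, 1]

def log_likelihood(p, xs):
    """Log of f(x_1,...,x_n | p) for i.i.d. Bernoulli(p) data."""
    return sum(math.log(p) if x == 1 else math.log(1 - p) for x in xs)

# Evaluate the log-likelihood over a grid of candidate parameter values.
grid = [i / 100 for i in range(1, 100)]
best = max(grid, key=lambda p: log_likelihood(p, sample))
worst = min(grid, key=lambda p: log_likelihood(p, sample))

print(best)   # the maximizer: 0.7, the sample mean
print(worst)  # the minimizer: 0.01, which makes 7 heads nearly impossible
```

The maximizer agrees with the closed-form MLE $\hat{p} = \bar{x}$; the minimizer is a parameter value that would almost never have produced the data we actually saw, which is exactly why minimizing would be a useless definition.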