Proving Maximum Likelihood Estimator (MLE) of Theta is the sample mean

$\displaystyle p(x) = \frac{\theta^x e^{-\theta}}{x!}, \qquad x = 0, 1, 2, \ldots$

You may assume that $E(X) = \theta$ and $V(X) = \theta$.

A random sample was examined....

Show that the maximum likelihood estimator (MLE) of (Theta) is the sample mean.

Any help, or a point in the right direction would be great. Any worked solutions are also welcome!

Thank you!

Re: Proving Maximum Likelihood Estimator (MLE) of Theta is the sample mean

The likelihood of a sample of independent $x$ values (I assume there are $n$ values in total) is:

$\displaystyle L(\theta) = \prod_{i=1}^{n} \left[ \frac{\left( \theta^{x_i} \right) e^{-\theta}}{{x_i}!} \right]$

Use the log likelihood...

$\displaystyle Log L(\theta) = \ln \prod_{i} \left(\frac{\theta^{x_i} e^{-\theta}}{x_i!} \right)$

Simplify using log rules:

$\displaystyle Log L(\theta) = \sum_i \ln \left[\frac{\theta^{x_i} e^{-\theta}}{x_i!} \right]$

$\displaystyle Log L(\theta) = \sum_i \left[\ln(\theta^{x_i}) + \ln(e^{-\theta}) - \ln(x_i!) \right]$

$\displaystyle Log L(\theta) = \sum_i \left[x_i\ln(\theta) -\theta - \ln(x_i!) \right]$

(Note the minus sign on $\ln(x_i!)$, since $x_i!$ is in the denominator.)

Simplify the sums:

$\displaystyle Log L(\theta) = \sum_i x_i\ln(\theta) -\sum_i \theta - \sum_i \ln(x_i!) $

$\displaystyle Log L(\theta) = \ln(\theta)\sum_i x_i - n\theta - \sum_i \ln(x_i!) $

Try to finish it yourself from there. If you get stuck, the solution is in the spoiler.
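As a quick numerical sanity check (separate from the algebra, so it won't spoil the last step), the sketch below evaluates the Poisson log likelihood derived above on a grid of candidate $\theta$ values and compares the maximizer to the sample mean. The sample values and grid range are made up for illustration.

```python
import math

# A made-up sample of Poisson counts (for illustration only)
sample = [2, 0, 3, 1, 4, 2, 2, 5, 1, 0]
n = len(sample)

def log_likelihood(theta):
    # Log L(theta) = ln(theta) * sum(x_i) - n*theta - sum(ln(x_i!))
    return (math.log(theta) * sum(sample)
            - n * theta
            - sum(math.log(math.factorial(x)) for x in sample))

# Maximize over a fine grid of candidate theta values in (0, 10)
grid = [i / 1000 for i in range(1, 10000)]
theta_hat = max(grid, key=log_likelihood)

sample_mean = sum(sample) / n
print(theta_hat, sample_mean)  # the grid maximizer lands on the sample mean
```

Since the log likelihood is strictly concave in $\theta$, the grid search finds a single peak, and it sits exactly at the sample mean, which is what the analytic derivation should confirm.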