# Thread: Proving Maximum Likelihood Estimator (MLE) of Theta is the sample mean

1. ## Proving Maximum Likelihood Estimator (MLE) of Theta is the sample mean

$p(x) = \frac{\theta^x e^{-\theta}}{x!}, \quad x = 0, 1, 2, \ldots$

You may assume that $E(X) = \theta$ and $V(X) = \theta$.

A random sample was examined....

Show that the maximum likelihood estimator (MLE) of (Theta) is the sample mean.

Any help, or a point in the right direction would be great. Any worked solutions are also welcome!

Thank you!

2. ## Re: Proving Maximum Likelihood Estimator (MLE) of Theta is the sample mean

The likelihood of a sample of independent x values is: (I assume there are n values in total)

$L(\theta) = \prod_{i=1}^{n} \left[ \frac{\left( \theta^{x_i} \right) e^{-\theta}}{{x_i}!} \right]$

use log likelihood...

$\log L(\theta) = \ln \prod_{i} \left(\frac{\theta^{x_i} e^{-\theta}}{x_i!} \right)$

simplify using log rules
$\log L(\theta) = \sum_i \ln \left[\frac{\theta^{x_i} e^{-\theta}}{x_i!} \right]$

$\log L(\theta) = \sum_i \left[\ln(\theta^{x_i}) + \ln(e^{-\theta}) - \ln(x_i!) \right]$
$\log L(\theta) = \sum_i \left[x_i\ln(\theta) - \theta - \ln(x_i!) \right]$

(Note the minus sign on the $\ln(x_i!)$ term, since $x_i!$ is in the denominator.)

Simplify the sums
$\log L(\theta) = \ln(\theta)\sum_i x_i - \sum_i \theta - \sum_i \ln(x_i!)$
$\log L(\theta) = \ln(\theta)\sum_i x_i - n\theta - \sum_i \ln(x_i!)$
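Not part of the original working, but if you want to check the algebra numerically, here is a minimal Python sketch (with made-up sample data) comparing the direct sum of log-pmf terms against the simplified closed form:

```python
import math

# Hypothetical sample data, for illustration only
x = [2, 0, 3, 1, 4]
theta = 1.7
n = len(x)

# Direct log-likelihood: sum the log of each Poisson pmf term
direct = sum(
    xi * math.log(theta) - theta - math.log(math.factorial(xi))
    for xi in x
)

# Simplified form: ln(theta) * sum(x_i) - n*theta - sum(ln(x_i!))
simplified = (
    math.log(theta) * sum(x)
    - n * theta
    - sum(math.log(math.factorial(xi)) for xi in x)
)

print(abs(direct - simplified) < 1e-12)  # → True
```

The two values agree to floating-point precision, which is a quick confirmation that the log-rule simplification above is correct.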

Try to finish it yourself from there. If you get stuck, the solution is in the spoiler.
Spoiler:

Differentiate with respect to theta

$\frac{d(\log L(\theta))}{d \theta} = \frac{\sum_i x_i}{\theta} - n + 0$

Set the derivative equal to 0 and solve to get the MLE ( $\hat{\theta}$ ):

$0 = \frac{\sum_i x_i}{\hat{\theta}} - n$
$\hat{\theta} = \frac{\sum_i x_i}{n}$
$\hat{\theta} = \bar{x}$
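As an optional sanity check (my addition, using hypothetical sample data), you can confirm numerically that the log-likelihood really is largest at the sample mean:

```python
import math

# Hypothetical sample, for illustration only
x = [2, 0, 3, 1, 4]
n = len(x)

def log_lik(theta):
    # Poisson log-likelihood; the ln(x_i!) term is constant in theta, so it is omitted
    return math.log(theta) * sum(x) - n * theta

xbar = sum(x) / n  # sample mean = 2.0

# The log-likelihood at the sample mean should beat nearby values of theta
candidates = [xbar - 0.5, xbar - 0.1, xbar + 0.1, xbar + 0.5]
print(all(log_lik(xbar) > log_lik(t) for t in candidates))  # → True
```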
Your professor may expect you to prove that the turning point is a maximum, which you can do using whatever methods you prefer.
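For that maximum check, the second derivative is the quickest route:

$\frac{d^2(\log L(\theta))}{d \theta^2} = -\frac{\sum_i x_i}{\theta^2} < 0$

for any $\theta > 0$ (provided not every $x_i$ is zero), so the turning point is indeed a maximum.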