
Thread: Proving Maximum Likelihood Estimator (MLE) of Theta is the sample mean

  1. #1
    Newbie
    Joined
    Mar 2013
    From
    Bolton
    Posts
    1

    Question Proving Maximum Likelihood Estimator (MLE) of Theta is the sample mean

$\displaystyle p(x) = \frac{\theta^x e^{-\theta}}{x!}, \qquad x = 0, 1, 2, \dots$

You may assume that $\displaystyle E(X) = \theta$ and $\displaystyle V(X) = \theta$.

A random sample was examined....

    Show that the maximum likelihood estimator (MLE) of (Theta) is the sample mean.



    Any help, or a point in the right direction would be great. Any worked solutions are also welcome!

Thank you!

  2. #2
    MHF Contributor
    Joined
    May 2010
    Posts
    1,034
    Thanks
    28

    Re: Proving Maximum Likelihood Estimator (MLE) of Theta is the sample mean

The likelihood of a sample of independent $x$ values (I assume there are $n$ values in total) is:

    $\displaystyle L(\theta) = \prod_{i=1}^{n} \left[ \frac{\left( \theta^{x_i} \right) e^{-\theta}}{{x_i}!} \right]$

Take the log-likelihood:

$\displaystyle \ln L(\theta) = \ln \prod_{i=1}^{n} \left(\frac{\theta^{x_i} e^{-\theta}}{x_i!} \right)$

simplify using log rules (note that $x_i!$ is in the denominator, so its log enters with a minus sign)
$\displaystyle \ln L(\theta) = \sum_i \ln \left[\frac{\theta^{x_i} e^{-\theta}}{x_i!} \right]$

$\displaystyle \ln L(\theta) = \sum_i \left[\ln(\theta^{x_i}) + \ln(e^{-\theta}) - \ln(x_i!) \right]$
$\displaystyle \ln L(\theta) = \sum_i \left[x_i\ln(\theta) -\theta - \ln(x_i!) \right]$

Simplify the sums
$\displaystyle \ln L(\theta) = \sum_i x_i\ln(\theta) -\sum_i \theta - \sum_i \ln(x_i!) $
$\displaystyle \ln L(\theta) = \ln(\theta)\sum_i x_i -n \theta - \sum_i \ln(x_i!) $
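As a quick sanity check, the simplified log-likelihood is easy to evaluate numerically. Here is a Python sketch (the sample values and the helper name `poisson_log_likelihood` are my own, made up for illustration):

```python
import math

def poisson_log_likelihood(theta, xs):
    # ln L(theta) = ln(theta) * sum(x_i) - n*theta - sum(ln(x_i!))
    # math.lgamma(x + 1) computes ln(x!)
    n = len(xs)
    return (math.log(theta) * sum(xs)
            - n * theta
            - sum(math.lgamma(x + 1) for x in xs))

xs = [2, 0, 3, 1, 4]        # hypothetical sample
xbar = sum(xs) / len(xs)    # sample mean = 2.0

# The log-likelihood should be larger at the sample mean than at nearby values:
print(poisson_log_likelihood(xbar, xs))
print(poisson_log_likelihood(xbar + 0.5, xs))
print(poisson_log_likelihood(xbar - 0.5, xs))
```

The comparison already hints at the answer: the log-likelihood peaks at the sample mean.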


Try to finish it yourself from there. If stuck, the solution is in the spoiler.
    Spoiler:

    Differentiate with respect to theta

$\displaystyle \frac{d(\ln L(\theta))}{d \theta} = \frac{\sum_i x_i}{\theta} -n + 0 $

Set the derivative equal to 0 and solve to get the MLE ($\displaystyle \hat{\theta}$)

$\displaystyle 0 = \frac{\sum_i x_i}{\hat{\theta}} - n$
$\displaystyle \hat{\theta} = \frac{\sum_i x_i}{n}$
$\displaystyle \hat{\theta} = \bar{x}$
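To double-check the algebra, here is a brute-force numerical check in Python (a sketch with a made-up sample): maximizing the log-likelihood over a fine grid of $\theta$ values lands on the sample mean, up to grid resolution.

```python
import math

xs = [3, 1, 4, 1, 5, 0, 2]          # hypothetical sample
xbar = sum(xs) / len(xs)            # sample mean

def log_lik(theta):
    # ln L(theta) = ln(theta)*sum(x_i) - n*theta - sum(ln(x_i!))
    return (math.log(theta) * sum(xs)
            - len(xs) * theta
            - sum(math.lgamma(x + 1) for x in xs))

grid = [0.01 * k for k in range(1, 1001)]   # theta in (0, 10]
best = max(grid, key=log_lik)
print(best, xbar)   # best is within one grid step of xbar
```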
    Your professor may expect you to prove that the turning point is a maximum, which you can do using whatever methods you prefer.
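One standard route (not the only one) is the second derivative: differentiating again gives

$\displaystyle \frac{d^2(\ln L(\theta))}{d \theta^2} = -\frac{\sum_i x_i}{\theta^2}$

which is negative for every $\theta > 0$ (provided at least one $x_i > 0$), so the stationary point $\displaystyle \hat{\theta} = \bar{x}$ is indeed a maximum.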
    Last edited by SpringFan25; Mar 27th 2013 at 03:58 PM.

