Math Help - estimation

  1. #1 redpack

    estimation

    Let f(x;\theta)=\frac{1}{\theta}x^{\frac{1-\theta}{\theta}}, 0 < x < 1, 0 < \theta < \infty

    a. show that the maximum likelihood estimator of \theta is MLE\theta=-\frac{1}{n}\sum_{i=1}^{n}\ln X_{i}

    b. show that E(MLE\theta)=\theta and thus MLE\theta is an unbiased estimator of \theta

    Thank you for any help you can give me!

    *I wrote MLE rather than a hat because I did not know how to put the estimator mark (\hat\theta) above the symbol

  2. #2 Moo

    Hello,
    Quote Originally Posted by redpack
    Let f(x;\theta)=\frac{1}{\theta}x^{\frac{1-\theta}{\theta}}, 0 < x < 1, 0 < \theta < \infty

    a. show that the maximum likelihood estimator of \theta is MLE\theta=-\frac{1}{n}\sum_{i=1}^{n}\ln X_{i}
    g(x,\theta)=\prod_{i=1}^n f(x_i,\theta) (where x=(x_1,\dots,x_n))

    g(x,\theta)=\frac{1}{\theta^n} \left(\prod_{i=1}^n x_i\right)^{\frac{1-\theta}{\theta}}

    Find the value \hat\theta (the hat is the usual notation for an estimator) that makes \frac{\partial \ln(g(x,\theta))}{\partial \theta}=0

    \ln(g(x,\theta))=-n\ln(\theta)+\frac{1-\theta}{\theta} \ln\left(\prod_{i=1}^n x_i\right)=-n\ln(\theta)+\frac{1-\theta}{\theta} \sum_{i=1}^n \ln(x_i)

    Thus
    \frac{\partial \ln(g(x,\theta))}{\partial \theta}=-\frac n\theta-\frac{1}{\theta^2} \sum_{i=1}^n \ln(x_i)

    Setting this equal to zero gives -\frac n\theta=\frac{1}{\theta^2} \sum_{i=1}^n \ln(x_i), that is \hat\theta=-\frac 1n \sum_{i=1}^n \ln(x_i), which is the claimed result.
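    (Not in the original post — a minimal numerical sanity check, assuming plain Python with only the standard library. If U is uniform on (0,1), then X=U^\theta has exactly the density f(x;\theta) above, so we can simulate data and confirm that a brute-force search over the log-likelihood peaks at -\frac 1n \sum_{i=1}^n \ln(x_i).)

```python
import math
import random

random.seed(42)
theta_true = 2.0
n = 20000

# If U ~ Uniform(0,1), then X = U**theta has density (1/theta)*x**((1-theta)/theta) on (0,1)
xs = [random.random() ** theta_true for _ in range(n)]
log_sum = sum(math.log(x) for x in xs)  # sum of ln(x_i), reused below

def log_likelihood(t):
    # ln g(x, t) = -n*ln(t) + ((1 - t)/t) * sum(ln x_i)
    return -n * math.log(t) + (1 - t) / t * log_sum

# Closed-form MLE from part (a)
theta_hat = -log_sum / n

# A brute-force grid search over theta should peak at (nearly) the same value,
# since the derivative computed above is n*(theta_hat - t)/t^2: positive below
# theta_hat, negative above it, so the log-likelihood is unimodal.
grid = [k / 100 for k in range(1, 501)]  # 0.01 .. 5.00
theta_grid = max(grid, key=log_likelihood)

print(theta_hat, theta_grid)  # both close to theta_true = 2.0
```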

    b. show that E(MLE\theta)=\theta and thus MLE\theta is an unbiased estimator of \theta
    Since the X_i are identically distributed, the \ln(X_i) are identically distributed, and hence they all have the same expectation. So we have :

    \mathbb{E}(\hat{\theta})=-\frac 1n \mathbb{E}\left(\sum_{i=1}^n \ln(X_i)\right)=-\frac 1n \sum_{i=1}^n \mathbb{E}(\ln(X_i))=-\mathbb{E}(\ln(X_1))

    Now, how to calculate \mathbb{E}(\ln(X_1)) ?

    By the law of the unconscious statistician, we have :

    \mathbb{E}(\ln(X_1))=\frac 1\theta \int_0^1 \ln(x)x^{\frac{1-\theta}{\theta}} ~dx

    Note that \frac{d}{dx}\left(\theta x^{1/\theta}\right)=x^{\frac{1-\theta}{\theta}}, which suggests a substitution.

    So substitute u=x^{1/\theta}

    Then du=\frac{1}{\theta} x^{\frac{1-\theta}{\theta}} ~dx, that is x^{\frac{1-\theta}{\theta}} ~dx=\theta ~du, and \ln(x)=\theta \ln(u)

    And x=0 \Rightarrow u=0, x=1\Rightarrow u=1


    So now the integral is just
    \mathbb{E}(\ln(X_1))=\frac 1\theta \int_0^1 \theta\ln(u) \cdot \theta ~du=\theta\Big[u\ln(u)-u\Big]_0^1=-\theta
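    (Again not in the original post: a quick midpoint-rule check of \mathbb{E}(\ln(X_1))=-\theta in plain Python, with \theta=1/2 chosen so that the integrand \ln(x)\,x^{\frac{1-\theta}{\theta}}=x\ln(x) stays bounded on (0,1].)

```python
import math

theta = 0.5
a = (1 - theta) / theta  # exponent (1-theta)/theta; equals 1 here

# Midpoint-rule approximation of E(ln X_1) = (1/theta) * integral_0^1 ln(x)*x^a dx
N = 200000
h = 1.0 / N
total = 0.0
for k in range(N):
    x = (k + 0.5) * h        # midpoint of the k-th subinterval
    total += math.log(x) * x ** a
expectation = total * h / theta

print(expectation)  # should be close to -theta = -0.5
```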

    And finally \boxed{\mathbb{E}(\hat{\theta})=\theta}


    Yay !!!!!!!!!!!!!!


    I hope this helps (there are a few steps left for you to fill in yourself ^^)
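    (One last optional check, not in the original thread — a small Monte Carlo sketch in plain Python. Averaging \hat\theta over many independent samples should come out near \theta, since \mathbb{E}(\hat\theta)=\theta holds for every sample size n, even a small one.)

```python
import math
import random

random.seed(1)
theta = 1.5
n = 50          # sample size per replication (deliberately small)
reps = 4000     # number of independent replications

def sample_theta_hat():
    # X = U**theta ~ f(x; theta), so theta_hat = -(1/n) * sum(ln x_i)
    xs = [random.random() ** theta for _ in range(n)]
    return -sum(math.log(x) for x in xs) / n

avg = sum(sample_theta_hat() for _ in range(reps)) / reps
print(avg)  # close to theta = 1.5, illustrating E(theta_hat) = theta
```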
    Last edited by Moo; May 29th 2009 at 11:16 AM. Reason: simplified the first part...

