
Thread: estimation

  1. #1
    Banned
    Joined
    May 2009
    Posts
    20

    estimation

    Let $\displaystyle f(x;\theta)=\frac{1}{\theta}x^{\frac{1-\theta}{\theta}}$, 0 < x < 1, 0 < $\displaystyle \theta$ < $\displaystyle \infty$

    a. Show that the maximum likelihood estimator of $\displaystyle \theta$ is $\displaystyle \hat\theta=-\frac{1}{n} \sum^{n}_{i=1} \ln X_{i}$

    b. Show that $\displaystyle E(\hat\theta)=\theta$, and thus that $\displaystyle \hat\theta$ is an unbiased estimator of $\displaystyle \theta$

    Thank you for any help you can give me!

  2. #2
    Moo
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6
    Hello,
    Quote Originally Posted by redpack View Post
    Let $\displaystyle f(x;\theta)=\frac{1}{\theta}x^{\frac{1-\theta}{\theta}}$, 0 < x < 1, 0 < $\displaystyle \theta$ < $\displaystyle \infty$

    a. Show that the maximum likelihood estimator of $\displaystyle \theta$ is $\displaystyle \hat\theta=-\frac{1}{n} \sum^{n}_{i=1} \ln X_{i}$
    $\displaystyle g(x,\theta)=\prod_{i=1}^n f(x_i,\theta)$ (where $\displaystyle x=(x_1,\dots,x_n)$)

    $\displaystyle g(x,\theta)=\frac{1}{\theta^n} \left(\prod_{i=1}^n x_i\right)^{\frac{1-\theta}{\theta}}$

    --------> Find the value $\displaystyle \hat\theta$ of $\displaystyle \theta$ that makes $\displaystyle \frac{\partial \ln(g(x,\theta))}{\partial \theta}=0$

    $\displaystyle \ln(g(x,\theta))=-n\ln(\theta)+\frac{1-\theta}{\theta} \ln\left(\prod_{i=1}^n x_i\right)=-n\ln(\theta)+\frac{1-\theta}{\theta} \sum_{i=1}^n \ln(x_i)$

    Thus
    $\displaystyle \frac{\partial \ln(g(x,\theta))}{\partial \theta}=-\frac n\theta-\frac{1}{\theta^2} \sum_{i=1}^n \ln(x_i)$

    Setting this equal to zero and multiplying through by $\displaystyle -\frac{\theta^2}{n}$ gives $\displaystyle \hat\theta=-\frac 1n \sum_{i=1}^n \ln(x_i)$. (You can check that the second derivative is negative at $\displaystyle \hat\theta$, so this critical point is indeed a maximum.)
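    Not part of the exercise, but here is a small Python sketch to sanity-check the closed form (the helper name `loglik` and the test value of $\displaystyle \theta$ are my own choices; sampling uses the inverse CDF $\displaystyle F(x)=x^{1/\theta}$, so $\displaystyle X=U^\theta$ with $\displaystyle U$ uniform on (0,1)):

    ```python
    import math
    import random

    def loglik(theta, xs):
        # ln g(x, theta) = -n ln(theta) + ((1 - theta)/theta) * sum ln(x_i)
        return (-len(xs) * math.log(theta)
                + (1 - theta) / theta * sum(math.log(x) for x in xs))

    random.seed(0)
    true_theta = 0.7
    # Inverse-CDF sampling: F(x) = x^(1/theta) on (0, 1), so X = U^theta
    xs = [random.random() ** true_theta for _ in range(10_000)]

    # Closed-form MLE from part a
    theta_hat = -sum(math.log(x) for x in xs) / len(xs)
    print(theta_hat)  # should land near true_theta

    # theta_hat should beat nearby values of theta on the log-likelihood
    for d in (-0.05, -0.01, 0.01, 0.05):
        assert loglik(theta_hat, xs) > loglik(theta_hat + d, xs)
    ```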

    b. Show that $\displaystyle E(\hat\theta)=\theta$, and thus that $\displaystyle \hat\theta$ is an unbiased estimator of $\displaystyle \theta$
    Since the $\displaystyle X_i$ are identically distributed, the $\displaystyle \ln(X_i)$ are identically distributed as well, and hence all have the same expectation. So we have:

    $\displaystyle \mathbb{E}(\hat{\theta})=-\frac 1n \mathbb{E}\left(\sum_{i=1}^n \ln(X_i)\right)=-\frac 1n \sum_{i=1}^n \mathbb{E}(\ln(X_i))=-\mathbb{E}(\ln(X_1))$

    Now, how do we calculate $\displaystyle \mathbb{E}(\ln(X_1))$?

    By the law of the unconscious statistician, we have:

    $\displaystyle \mathbb{E}(\ln(X_1))=\frac 1\theta \int_0^1 \ln(x)x^{\frac{1-\theta}{\theta}} ~dx$

    You can see that the integrand is, up to a factor that is constant in $\displaystyle x$ (it may depend on $\displaystyle \theta$), the derivative of $\displaystyle x^{\frac{1-\theta}{\theta}}$ with respect to $\displaystyle \theta$:

    $\displaystyle \frac{\partial}{\partial \theta} x^{\frac{1-\theta}{\theta}}=\frac{\partial}{\partial \theta} e^{\left(\frac{1}{\theta}-1\right)\ln(x)}=-\frac{1}{\theta^2} \ln(x) x^{\frac{1-\theta}{\theta}}$

    so that $\displaystyle \ln(x) x^{\frac{1-\theta}{\theta}}=-\theta^2 \frac{\partial}{\partial \theta} x^{\frac{1-\theta}{\theta}}$.

    Also note that $\displaystyle \int_0^1 x^{\frac{1-\theta}{\theta}} ~dx=\theta \left[x^{1/\theta}\right]_0^1=\theta$ (this is just the statement that the pdf integrates to 1, multiplied by $\displaystyle \theta$).

    So, differentiating under the integral sign, the integral is just
    $\displaystyle \mathbb{E}(\ln(X_1))=-\frac{\theta^2}{\theta} \frac{d}{d\theta} \int_0^1 x^{\frac{1-\theta}{\theta}} ~dx=-\theta \frac{d}{d\theta}\theta=-\theta$
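    As a numerical sanity check of this expectation (not part of the exercise; the grid size and the test value $\displaystyle \theta=0.8$ are arbitrary choices of mine), a midpoint-rule approximation of the integral in Python:

    ```python
    import math

    theta = 0.8  # arbitrary test value
    # Midpoint rule for (1/theta) * integral_0^1 ln(x) * x^((1-theta)/theta) dx
    N = 200_000
    total = sum(math.log((k + 0.5) / N) * ((k + 0.5) / N) ** ((1 - theta) / theta)
                for k in range(N))
    approx = total / (N * theta)
    print(approx)  # should be close to -theta
    ```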

    And finally $\displaystyle \boxed{\mathbb{E}(\hat{\theta})=\theta}$
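    And a quick Monte Carlo check of unbiasedness (again just a sketch; the sample size, number of replications, and seed are arbitrary). Since $\displaystyle \ln(X)=\theta \ln(U)$ under the inverse-CDF sampling $\displaystyle X=U^\theta$, the MLE of each simulated sample can be computed directly:

    ```python
    import math
    import random

    random.seed(1)
    theta, n, reps = 0.5, 200, 2_000

    # Average the MLE over many simulated samples of size n
    mean_est = sum(
        -sum(theta * math.log(random.random()) for _ in range(n)) / n
        for _ in range(reps)
    ) / reps
    print(mean_est)  # should be very close to theta
    ```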


    Yay !!!!!!!!!!!!!!


    I hope this helps (there are several steps left to be filled by yourself ^^)
    Last edited by Moo; May 29th 2009 at 11:16 AM. Reason: simplified the first part...

