# estimation

• May 28th 2009, 09:17 PM
redpack
estimation
Let $f(x;\theta)=\frac{1}{\theta}x^{\frac{1-\theta}{\theta}}$, $0<x<1$, $0<\theta<\infty$.

a. show that the maximum likelihood estimator of $\theta$ is $\text{MLE }\theta=-\frac{1}{n} \sum^{n}_{i=1} \ln X_{i}$

b. show that $E(\text{MLE }\theta)=\theta$, and thus that $\text{MLE }\theta$ is an unbiased estimator of $\theta$

*I wrote MLE instead of the usual notation because I did not know how to put the estimator mark (the hat) above the appropriate symbol
• May 28th 2009, 11:50 PM
Moo
Hello,
Quote:

Originally Posted by redpack
Let $f(x;\theta)=\frac{1}{\theta}x^{\frac{1-\theta}{\theta}}$, $0<x<1$, $0<\theta<\infty$.

a. show that the maximum likelihood estimator of $\theta$ is $\text{MLE }\theta=-\frac{1}{n} \sum^{n}_{i=1} \ln X_{i}$

$g(x,\theta)=\prod_{i=1}^n f(x_i,\theta)$ (where $x=(x_1,\dots,x_n)$)

$g(x,\theta)=\frac{1}{\theta^n} \left(\prod_{i=1}^n x_i\right)^{\frac{1-\theta}{\theta}}$

Now find the value $\hat\theta$ (this is the "MLE $\theta$" of the problem statement) that makes $\frac{\partial \ln(g(x,\theta))}{\partial \theta}=0$

$\ln(g(x,\theta))=-n\ln(\theta)+\frac{1-\theta}{\theta} \ln\left(\prod_{i=1}^n x_i\right)=-n\ln(\theta)+\frac{1-\theta}{\theta} \sum_{i=1}^n \ln(x_i)$

Thus
$\frac{\partial \ln(g(x,\theta))}{\partial \theta}=-\frac n\theta-\frac{1}{\theta^2} \sum_{i=1}^n \ln(x_i)$

Setting this derivative to zero gives $\frac n\theta=-\frac{1}{\theta^2} \sum_{i=1}^n \ln(x_i)$, that is $\hat\theta=-\frac 1n \sum_{i=1}^n \ln(X_i)$ (check that this is indeed a maximum), and the result follows.
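As a quick numerical sanity check (not part of the exercise, and the variable names are my own): the CDF of this density is $F(x)=x^{1/\theta}$, so inverse-transform sampling gives $X=U^\theta$ with $U$ uniform on $(0,1)$. We can simulate a sample, compute the closed-form estimator from part (a), and verify that the score vanishes there:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.0
n = 100_000

# Inverse-transform sampling: F(x) = x^(1/theta) on (0,1), so X = U^theta.
u = rng.uniform(size=n)
x = u ** theta_true

# Closed-form MLE from part (a): theta_hat = -(1/n) * sum(ln x_i)
s = np.sum(np.log(x))
theta_hat = -s / n

# The score -n/theta - (1/theta^2) * sum(ln x_i) should vanish at theta_hat.
score = -n / theta_hat - s / theta_hat**2

print(theta_hat)  # close to theta_true = 2.0
print(score)      # essentially 0
```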

Quote:

b. show that $E(MLE\theta)=\theta$ and thus $MLE\theta$ is an unbiased estimator of $\theta$
Since the $X_i$ are identically distributed, the $\ln(X_i)$ are identically distributed as well, and hence they all have the same expectation. So we have:

$\mathbb{E}(\hat{\theta})=-\frac 1n \mathbb{E}\left(\sum_{i=1}^n \ln(X_i)\right)=-\frac 1n \sum_{i=1}^n \mathbb{E}(\ln(X_i))=-\mathbb{E}(\ln(X_1))$

Now, how do we calculate $\mathbb{E}(\ln(X_1))$?

By the law of the unconscious statistician, we have:

$\mathbb{E}(\ln(X_1))=\frac 1\theta \int_0^1 \ln(x)x^{\frac{1-\theta}{\theta}} ~dx$

You can see that the integrand is, up to a constant (constant with respect to $x$, so it may involve $\theta$), the derivative of $x^{\frac{1-\theta}{\theta}}$ with respect to $\theta$:

$\frac{\partial}{\partial \theta}\, x^{\frac{1-\theta}{\theta}}=-\frac{1}{\theta^2} \ln(x)\, x^{\frac{1-\theta}{\theta}}$

Also note that $\int_0^1 x^{\frac{1-\theta}{\theta}} ~dx=\theta$ (a calculation you would have done when checking that $f$ is a density). Differentiating both sides with respect to $\theta$ (differentiation under the integral sign) gives

$-\frac{1}{\theta^2} \int_0^1 \ln(x)\, x^{\frac{1-\theta}{\theta}} ~dx=1 \quad\Longrightarrow\quad \int_0^1 \ln(x)\, x^{\frac{1-\theta}{\theta}} ~dx=-\theta^2$

So the expectation is just
$\mathbb{E}(\ln(X_1))=\frac 1\theta \cdot (-\theta^2)=-\theta$
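If you want to double-check the integral numerically, a quick quadrature (a sketch using scipy; the choice of $\theta=2$ and the names are mine) confirms that $\int_0^1 \ln(x)\, x^{\frac{1-\theta}{\theta}}\,dx=-\theta^2$:

```python
import numpy as np
from scipy.integrate import quad

theta = 2.0

# E(ln X_1) = (1/theta) * integral_0^1 ln(x) * x^((1-theta)/theta) dx
val, err = quad(lambda x: np.log(x) * x ** ((1 - theta) / theta), 0, 1)

print(val)          # close to -theta^2 = -4
print(val / theta)  # close to E(ln X_1) = -theta = -2
```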

And finally $\boxed{\mathbb{E}(\hat{\theta})=\theta}$
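A small Monte Carlo experiment (again just a sketch, with names of my own choosing) illustrates the unbiasedness: averaging $\hat\theta$ over many independent samples recovers $\theta$. Note that with $X=U^\theta$ we get $-\ln X=-\theta\ln U$, an exponential variable with mean $\theta$, which is another way to see why the estimator is unbiased.

```python
import numpy as np

rng = np.random.default_rng(42)
theta = 1.5
n, reps = 50, 20_000

# Draw `reps` samples of size n via inverse transform (X = U^theta)
# and compute theta_hat = -(1/n) * sum(ln X_i) for each sample.
u = rng.uniform(size=(reps, n))
theta_hats = -np.mean(theta * np.log(u), axis=1)  # ln X = theta * ln U

print(theta_hats.mean())  # close to theta = 1.5, i.e. E(theta_hat) = theta
```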

Yay !!!!!!!!!!!!!!

I hope this helps :D (there are several steps left to be filled by yourself ^^)