1. ## Estimation

Let $\displaystyle f(x;\theta)=\frac{1}{\theta}x^{\frac{1-\theta}{\theta}}$, $\displaystyle 0 < x < 1$, $\displaystyle 0 < \theta < \infty$.

a. Show that the maximum likelihood estimator of $\displaystyle \theta$ is $\displaystyle \text{MLE}\,\theta=-(1/n) \sum_{i=1}^{n} \ln X_{i}$.

b. Show that $\displaystyle E(\text{MLE}\,\theta)=\theta$ and thus $\displaystyle \text{MLE}\,\theta$ is an unbiased estimator of $\displaystyle \theta$.

*I wrote MLE$\displaystyle \theta$ rather than the usual notation because I did not know how to put the estimator mark (the hat) above the symbol.

2. Hello,
Originally Posted by redpack
Let $\displaystyle f(x;\theta)=\frac{1}{\theta}x^{\frac{1-\theta}{\theta}}$, $\displaystyle 0 < x < 1$, $\displaystyle 0 < \theta < \infty$.

a. Show that the maximum likelihood estimator of $\displaystyle \theta$ is $\displaystyle \text{MLE}\,\theta=-(1/n) \sum_{i=1}^{n} \ln X_{i}$.
$\displaystyle g(x,\theta)=\prod_{i=1}^n f(x_i,\theta)$ (where $\displaystyle x=(x_1,\dots,x_n)$)

$\displaystyle g(x,\theta)=\frac{1}{\theta^n} \left(\prod_{i=1}^n x_i\right)^{\frac{1-\theta}{\theta}}$

Find $\displaystyle \hat\theta$ (in LaTeX, \hat\theta produces the estimator mark you were looking for) that makes $\displaystyle \frac{\partial \ln(g(x,\theta))}{\partial \theta}=0$.

$\displaystyle \ln(g(x,\theta))=-n\ln(\theta)+\frac{1-\theta}{\theta} \ln\left(\prod_{i=1}^n x_i\right)=-n\ln(\theta)+\frac{1-\theta}{\theta} \sum_{i=1}^n \ln(x_i)$

Thus
$\displaystyle \frac{\partial \ln(g(x,\theta))}{\partial \theta}=-\frac n\theta-\frac{1}{\theta^2} \sum_{i=1}^n \ln(x_i)$

Setting this derivative to $0$ gives $\displaystyle -\frac n\theta=\frac{1}{\theta^2}\sum_{i=1}^n \ln(x_i)$, that is $\displaystyle \hat\theta=-\frac 1n \sum_{i=1}^n \ln(x_i)$, and the result follows.
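If you want to sanity-check part (a) numerically, here is a small Python sketch (my addition, assuming numpy and scipy are available; the numbers are illustrative): since the CDF here is $\displaystyle F(x)=x^{1/\theta}$, one can sample $\displaystyle X=U^\theta$ with $U$ uniform, then compare the closed-form MLE with a direct maximization of $\displaystyle \ln g(x,\theta)$.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
theta_true = 2.0  # illustrative value, not from the thread

# Inverse-CDF sampling: F(x) = x^(1/theta), so X = U^theta with U ~ Uniform(0, 1)
x = rng.uniform(size=100_000) ** theta_true

# Closed-form MLE from part (a)
theta_hat = -np.mean(np.log(x))

# Brute-force maximization of ln g(x, theta) for comparison
log_x_sum = np.log(x).sum()

def neg_log_lik(theta):
    # negative of ln g(x, theta) = -n ln(theta) + ((1 - theta)/theta) * sum(ln x_i)
    return -(-len(x) * np.log(theta) + (1 - theta) / theta * log_x_sum)

res = minimize_scalar(neg_log_lik, bounds=(0.01, 10.0), method="bounded")

print(theta_hat, res.x)  # both should be close to theta_true
```

The two estimates agree because the closed form is the exact maximizer of the same log-likelihood the optimizer searches over.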

b. show that $\displaystyle E(MLE\theta)=\theta$ and thus $\displaystyle MLE\theta$ is an unbiased estimator of $\displaystyle \theta$
Since the $\displaystyle X_i$ are identically distributed, the $\displaystyle \ln(X_i)$ are identically distributed, and hence they all have the same expectation. So we have:

$\displaystyle \mathbb{E}(\hat{\theta})=-\frac 1n \mathbb{E}\left(\sum_{i=1}^n \ln(X_i)\right)=-\frac 1n \sum_{i=1}^n \mathbb{E}(\ln(X_i))=-\mathbb{E}(\ln(X_1))$

Now, how to calculate $\displaystyle \mathbb{E}(\ln(X_1))$ ?

By the law of the unconscious statistician, we have:

$\displaystyle \mathbb{E}(\ln(X_1))=\frac 1\theta \int_0^1 \ln(x)x^{\frac{1-\theta}{\theta}} ~dx$

You can see that, apart from the $\displaystyle \ln(x)$ factor, the integrand $\displaystyle \frac 1\theta x^{\frac{1-\theta}{\theta}}$ is exactly the derivative (with respect to $x$) of $\displaystyle x^{1/\theta}$.

So substitute $\displaystyle u=x^{1/\theta}$

Then $\displaystyle du=\frac 1\theta x^{\frac{1-\theta}{\theta}} ~dx$, and $\displaystyle \ln(x)=\theta \ln(u)$

And $\displaystyle x=0 \Rightarrow u=0$, $\displaystyle x=1\Rightarrow u=1$

So now the integral is just
$\displaystyle \mathbb{E}(\ln(X_1))=\int_0^1 \theta \ln(u) ~du=\theta \Big[u\ln(u)-u\Big]_0^1=-\theta$
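You can also double-check this value of $\displaystyle \mathbb{E}(\ln(X_1))$ numerically (a sketch of mine, assuming scipy is available): evaluate $\displaystyle \frac 1\theta \int_0^1 \ln(x)x^{\frac{1-\theta}{\theta}} ~dx$ for a few values of $\theta$ and compare with $-\theta$.

```python
import numpy as np
from scipy.integrate import quad

results = {}
for theta in (0.5, 1.0, 3.0):
    # E[ln X_1] = (1/theta) * integral_0^1 ln(x) * x^((1-theta)/theta) dx
    val, _ = quad(lambda x: np.log(x) * x ** ((1 - theta) / theta), 0, 1)
    results[theta] = val / theta
    print(theta, results[theta])  # should be close to -theta
```

The integrand has a (mild, integrable) singularity at $x=0$, which `quad` handles without special treatment.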

And finally $\displaystyle \boxed{\mathbb{E}(\hat{\theta})=\theta}$
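The unbiasedness can also be illustrated by Monte Carlo (again my addition, assuming numpy; the sample size and number of repetitions are arbitrary): average $\displaystyle \hat\theta$ over many independent samples, even small ones, and compare the average with $\theta$.

```python
import numpy as np

rng = np.random.default_rng(1)
theta_true = 1.5  # illustrative value
n, reps = 50, 20_000

# Each row is one sample of size n; X = U^theta by inverse-CDF sampling
u = rng.uniform(size=(reps, n))
theta_hats = -np.log(u ** theta_true).mean(axis=1)

print(theta_hats.mean())  # should be close to theta_true even though n is small
```

Note that each individual $\displaystyle \hat\theta$ still fluctuates around $\theta$; unbiasedness only says the fluctuations average out.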

Yay !!!!!!!!!!!!!!

I hope this helps (there are a few steps left for you to fill in yourself ^^)