1. ## Maximum likelihood estimator

5. A random sample of size n is taken from a distribution with pdf:

f(x;θ) = θx^(θ-1), 0 < x < 1; θ > 0.

Find the maximum likelihood estimator of θ.

I tried teaching myself out of the book but it has no examples. Can someone run me through a step by step way of solving this? I have nothing to reference and my notes have 2 lines on it.

2. Originally Posted by wolverine21
5. A random sample of size n is taken from a distribution with pdf:

f(x;θ) = θx^(θ-1), 0 < x < 1; θ > 0.

Find the maximum likelihood estimator of θ.

I tried teaching myself out of the book but it has no examples. Can someone run me through a step by step way of solving this? I have nothing to reference and my notes have 2 lines on it.
The likelihood function $\displaystyle L(x_1, x_2, \ldots, x_n)$ is defined to be the joint pdf of the random variables $\displaystyle X_1, X_2, \ldots, X_n$.

Therefore $\displaystyle L(x_1, x_2, \ldots, x_n) = \left( \theta x_1^{\theta - 1} \right) \cdot \left( \theta x_2^{\theta - 1} \right) \cdots \left( \theta x_n^{\theta - 1} \right) = \theta^n \, ( x_1 x_2 \cdots x_n)^{\theta - 1}$.

The maximum likelihood estimate of $\displaystyle \theta$ is the value of $\displaystyle \theta$ that maximises $\displaystyle L(x_1, x_2, \ldots, x_n)$.

Since $\displaystyle \ln L$ is a monotonically increasing function of $\displaystyle L$, both $\displaystyle L$ and $\displaystyle \ln L$ attain their maximum at the same value of $\displaystyle \theta$. It's obviously easier in this instance to find the value of $\displaystyle \theta$ that maximises $\displaystyle \ln L$:

$\displaystyle \ln L = n \ln \theta + (\theta - 1) \ln (x_1 x_2 \cdots x_n)$

$\displaystyle \Rightarrow \frac{d \ln L}{d\theta} = \frac{n}{\theta} + \ln (x_1 x_2 \cdots x_n)$.

Now solve $\displaystyle \frac{d \ln L}{d\theta} = 0$ for $\displaystyle \theta$.
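For completeness, carrying out that last step using the derivative above:

$\displaystyle \frac{n}{\theta} + \ln (x_1 x_2 \cdots x_n) = 0 \Rightarrow \hat{\theta} = -\frac{n}{\ln (x_1 x_2 \cdots x_n)} = -\frac{n}{\sum_{i=1}^{n} \ln x_i}$.

Note that since each $\displaystyle x_i \in (0, 1)$, each $\displaystyle \ln x_i < 0$, so $\displaystyle \hat{\theta} > 0$.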

3. Please see my computation. I don't know if it is correct. I have a question: is it possible for the MLE to come out negative? Thanks!
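A quick numerical sketch of the estimator from post 2 may help check a hand computation. This assumes NumPy is available; the values of `theta_true` and `n` are illustrative, and samples are drawn via the inverse CDF, which for this pdf is $\displaystyle F^{-1}(u) = u^{1/\theta}$:

```python
import numpy as np

rng = np.random.default_rng(0)

theta_true = 2.5   # illustrative parameter value
n = 100_000        # illustrative sample size

# Sample via the inverse CDF: F(x) = x^theta on (0, 1), so X = U^(1/theta).
u = rng.uniform(size=n)
x = u ** (1.0 / theta_true)

# MLE from solving d(ln L)/d(theta) = 0:  theta_hat = -n / sum(ln x_i).
theta_hat = -n / np.log(x).sum()

print(theta_hat)

# Each ln(x_i) < 0 because 0 < x_i < 1, so theta_hat is always positive.
```

For large `n`, `theta_hat` lands close to `theta_true`, and because every $\displaystyle \ln x_i$ is negative, the estimate can never be negative.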