# Maximum likelihood estimator

• Nov 23rd 2008, 11:29 PM
wolverine21
Maximum likelihood estimator
5. A random sample of size n is taken from a distribution with pdf:

f(x; θ) = θx^(θ−1),  0 < x < 1;  θ > 0.

Find the maximum likelihood estimator of θ.

I tried teaching myself out of the book but it has no examples. Can someone run me through a step by step way of solving this? I have nothing to reference and my notes have 2 lines on it.
• Nov 24th 2008, 12:26 AM
mr fantastic
Quote:

Originally Posted by wolverine21
5. A random sample of size n is taken from a distribution with pdf:

f(x; θ) = θx^(θ−1),  0 < x < 1;  θ > 0.

Find the maximum likelihood estimator of θ.

I tried teaching myself out of the book but it has no examples. Can someone run me through a step by step way of solving this? I have nothing to reference and my notes have 2 lines on it.

The likelihood function $L(x_1, x_2, \, .... \, x_n)$ is defined to be the joint pdf of the random variables $X_1, \, X_2, \, .... \, X_n$. Since the sample is random, the $X_i$ are independent and the joint pdf is just the product of the individual pdfs.

Therefore $L(x_1, x_2, \, .... \, x_n) = \left( \theta x_1^{\theta - 1} \right) \cdot \left( \theta x_2^{\theta - 1} \right) \cdot .... \left( \theta x_n^{\theta - 1} \right) = \theta^n \, ( x_1 \cdot x_2 \cdot .... x_n)^{\theta - 1}$.

The maximum likelihood estimate of $\theta$ is the value of $\theta$ that maximises $L(x_1, x_2, \, .... \, x_n)$.

Since $\ln L$ is a monotonically increasing function of L, both L and $\ln L$ will be a maximum for the same value of $\theta$. It's obviously easier in this instance to find the value of $\theta$ that maximises $\ln L$:

$\ln L = n \ln \theta + (\theta - 1) \ln (x_1 \cdot x_2 \cdot .... x_n)$

$\Rightarrow \frac{d \ln L}{d\theta} = \frac{n}{\theta} + \ln (x_1 \cdot x_2 \cdot .... x_n)$.

Now solve $\frac{d \ln L}{d\theta} = 0$ for $\theta$:

$\frac{n}{\theta} + \ln (x_1 \cdot x_2 \cdot .... x_n) = 0 \Rightarrow \hat{\theta} = -\frac{n}{\ln (x_1 \cdot x_2 \cdot .... x_n)} = -\frac{n}{\sum_{i=1}^n \ln x_i}$.

Note that since $0 < x_i < 1$, each $\ln x_i$ is negative, so $\hat{\theta} > 0$ as required.
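A quick numerical sanity check can make the result concrete. The sketch below (my own illustration, not from the thread; the function names `sample` and `mle` are made up) draws data from $f(x;\theta) = \theta x^{\theta-1}$ by inverse-CDF sampling, since $F(x) = x^\theta$ gives $X = U^{1/\theta}$, and then compares the closed-form estimator $\hat{\theta} = -n/\sum \ln x_i$ against the true $\theta$:

```python
import math
import random

def sample(theta, n, rng):
    # Inverse-CDF sampling: F(x) = x^theta on (0, 1), so X = U^(1/theta).
    return [rng.random() ** (1.0 / theta) for _ in range(n)]

def mle(xs):
    # theta_hat = -n / sum(ln x_i); the sum is negative since 0 < x_i < 1,
    # so the estimate comes out positive.
    return -len(xs) / sum(math.log(x) for x in xs)

rng = random.Random(0)
xs = sample(theta=3.0, n=100_000, rng=rng)
print(mle(xs))  # should land close to the true value 3.0
```

With a large sample the estimate should sit very close to the true parameter, which is a useful way to catch a sign or algebra error in the derivation.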