# Thread: Maximum Likelihood Estimator and Bias

1. ## Maximum Likelihood Estimator and Bias

A random variable X has pdf

$\displaystyle f(x)=\frac 2\lambda \cdot x \cdot \exp\left(-\frac{x^2}{\lambda}\right), \quad x>0$

How would I find the MLE of $\displaystyle \lambda$ based on realisations $\displaystyle x_1,\dots,x_n$?

2. Hello,

Let $\displaystyle (x_1,\dots,x_n)$ be a sample of iid random variables with pdf f.

We have the joint pdf:

$\displaystyle f(x_1,\dots,x_n)=\prod_{i=1}^n \frac 2\lambda \cdot x_i \cdot \exp\left(-\frac{x_i^2}{\lambda}\right)$

$\displaystyle \ln f(x_1,\dots,x_n)=n (\ln(2)-\ln(\lambda))+\sum_{i=1}^n \ln(x_i)-\sum_{i=1}^n \frac{x_i^2}{\lambda}$

Its derivative with respect to $\displaystyle \lambda$ is :

$\displaystyle -\frac n\lambda+\frac{1}{\lambda^2}\sum_{i=1}^n x_i^2$

The MLE $\displaystyle \bar\lambda$ is such that :

$\displaystyle 0=-\frac{n}{\bar\lambda}+\frac{1}{\bar\lambda^2}\sum_{i=1}^n x_i^2$

Multiplying through by $\displaystyle \bar\lambda^2$ (which is legitimate, since $\displaystyle \lambda>0$ for the pdf to be defined) gives $\displaystyle n\bar\lambda=\sum_{i=1}^n x_i^2$.

So $\displaystyle \boxed{\bar\lambda=\frac 1n\sum_{i=1}^n x_i^2}$
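As a quick numerical sanity check of this estimator (a sketch, assuming a true value $\lambda = 2.5$, and using the fact that $X^2$ is exponentially distributed with mean $\lambda$, which follows from $P(X^2>t)=e^{-t/\lambda}$):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.5        # assumed "true" lambda for the demo
n = 100_000

# If X has pdf (2/lam) * x * exp(-x^2/lam) on (0, inf), then X^2 ~ Exponential(mean=lam),
# so we can sample X as the square root of an exponential draw.
x = np.sqrt(rng.exponential(scale=lam, size=n))

# The MLE derived above: (1/n) * sum of x_i^2
mle = np.mean(x**2)
print(mle)  # close to 2.5 for large n
```

The estimate concentrates around the true $\lambda$ as $n$ grows, as we would expect.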

-------------------------------------------
To see if it's unbiased, prove that $\displaystyle \mathbb{E}(\bar\lambda)=\lambda$

$\displaystyle \mathbb{E}(\bar\lambda)=\mathbb{E}\left(\frac 1n\sum_{i=1}^n x_i^2\right)=\frac 1n\sum_{i=1}^n \mathbb{E}(x_i^2)=\mathbb{E}(x_1^2)$ by linearity of expectation and because they're identically distributed.

And $\displaystyle \mathbb{E}(x_1^2)=\int_0^\infty x^2 f(x) ~dx$

Compute this integral and prove (or disprove) that it equals $\displaystyle \lambda$.
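For a numerical check of that integral (a sketch, again assuming $\lambda = 2.5$; the analytic computation via the substitution $u = x^2$ is left to you):

```python
import numpy as np
from scipy.integrate import quad

lam = 2.5  # assumed value of lambda for the check

def f(x):
    # the pdf of X on (0, inf)
    return (2.0 / lam) * x * np.exp(-x**2 / lam)

# E(X^2) = integral of x^2 * f(x) over (0, inf)
ex2, err = quad(lambda x: x**2 * f(x), 0, np.inf)
print(ex2)  # numerically equal to lam = 2.5
```

If the numerical value matches $\lambda$, that strongly suggests the estimator is unbiased, though only the analytic integral proves it.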