# Thread: Minimum variance unbiased estimator proof

1. ## Minimum variance unbiased estimator proof

"Show that the mean of a random sample of size n is a minimum variance unbiased estimator of the parameter $\lambda$ of a Poisson distribution."

Here are my steps. Somewhere along the way, I got lost.

1. The distribution is Poisson, so $\mu = \sigma^2 = \lambda$. Since $E(\overline{X}) = \mu = \lambda$, it is clear that the estimator is unbiased.
2. $f(x) = \frac{\lambda ^ x \exp(-\lambda)}{x!}$
3. $\ln f(x) = x \ln \lambda - \ln x! - \lambda$
4. $\{\frac{\partial}{\partial \lambda}[\ln f(x)] \}^2 = \frac{x^2}{\lambda^2} - \frac{2x}{\lambda} + 1$

There's no way that can be right... What did I do wrong?
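For what it's worth, step 4 can be sanity-checked numerically: since $E\left[(X/\lambda - 1)^2\right] = \operatorname{Var}(X)/\lambda^2 = 1/\lambda$, averaging the squared score over simulated Poisson draws should give the single-observation Fisher information $1/\lambda$. A quick sketch (assuming NumPy; $\lambda = 2$ is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0                                # arbitrary lambda for the check
x = rng.poisson(lam, size=1_000_000)

# squared score from step 4: (x/lam - 1)^2 = x^2/lam^2 - 2x/lam + 1
score_sq = (x / lam - 1.0) ** 2

print(score_sq.mean())                   # should be close to 1/lam = 0.5
```

So the algebra in step 4 is not wrong; its expectation collapses to $1/\lambda$.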

2. ## Re: Minimum variance unbiased estimator proof

a sample of $n$ i.i.d. Poisson variates has the joint distribution below.

we know the maximum likelihood estimate attains the minimum possible variance here; we need to see whether it's unbiased.

$\large p_{\vec{K}}(\vec{k})=\displaystyle{e^{-n\lambda} \prod_{i=1}^n}\dfrac {\lambda^{k_i}}{k_i!}$

$\large \ln(p_{\vec{K}}(\vec{k}))=\displaystyle{\sum_{i=1}^n}\left(k_i \ln(\lambda)-\ln(k_i!)-\lambda\right)$

$\large \dfrac \partial {\partial \lambda}\ln(p_{\vec{K}}(\vec{k}))=\dfrac 1 \lambda \displaystyle{\sum_{i=1}^n}\left(k_i - 1\right)$

setting this to zero we get

$\dfrac 1 \lambda \displaystyle{\sum_{i=1}^n}k_i = n$

$\dfrac 1 n \displaystyle{\sum_{i=1}^n}k_i = \lambda$

since $E[k_i]=\lambda$, we have $E\left[\frac 1 n \sum_{i=1}^n k_i\right]=\lambda$, so the estimator is unbiased, and thus the sample mean is our minimum variance unbiased estimate.
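The unbiasedness claim (and the variance $\lambda/n$ that the Cramér–Rao bound predicts) is easy to check by simulation: averaging the sample mean over many replications should recover $\lambda$. A sketch, assuming NumPy; the values of $\lambda$, $n$, and the replication count are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n, reps = 3.0, 25, 200_000          # arbitrary parameters for the check

samples = rng.poisson(lam, size=(reps, n))
means = samples.mean(axis=1)             # the estimator: sample mean per replication

print(means.mean())                      # E[X-bar], should be close to lam = 3.0
print(means.var())                       # Var(X-bar), should be close to lam/n = 0.12
```

The empirical variance landing on $\lambda/n$ is exactly the Cramér–Rao lower bound being attained.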

3. ## Re: Minimum variance unbiased estimator proof

This is what I get but no guarantee it is correct:

Step 4: x/λ - 1/λ - 1=0, Derivative wrt λ
Step 5: (x^2 - λ) / λx = 1, Put under common denominator, move 1 to RHS
Step 6: (x^2 - λ) = λx, Multiply both by denominator
Step 7: x - λ = λ, Divide by x both sides
Step 8: x= 2 λ, then square this function.

I have the right solution somewhere; if I find it later and this is incorrect, I'll let you know.
Toni

4. ## Re: Minimum variance unbiased estimator proof

This is brilliant; I see my answer had a goof in it. But shouldn't the derivative be $\sum_{i=1}^n\left(\frac{k_i}{\lambda} - 1\right)$? It doesn't affect the final answer, but I don't think we mean to multiply the $1$ by $\frac 1 \lambda$ as you have there in $\frac 1 \lambda \sum (k_i - 1)$, right? The $-1$ is just summed over $1$ to $n$, so it becomes $-n$ and then goes to the other side as $+n$, not $n/\lambda$.

Looks like just a minor parenthesis mismatch, if I am not mistaken (again! LOL).

5. ## Re: Minimum variance unbiased estimator proof

Originally Posted by TSmarsco
This is what I get but no guarantee it is correct:

Step 4: x/λ - 1/λ - 1=0, Derivative wrt λ
Step 5: (x^2 - λ) / λx = 1, Put under common denominator, move 1 to RHS
Step 6: (x^2 - λ) = λx, Multiply both by denominator
Step 7: x - λ = λ, Divide by x both sides
Step 8: x= 2 λ, then square this function.

I have the right solution somewhere; if I find it later and this is incorrect, I'll let you know.
Toni
OK, I see that's not right. But really all I have to do is set my step 4 equal to zero and go from there? Note that step 4 is a perfect-square trinomial, so I can back it up to the unsquared score and say:

5. $\frac{x}{\lambda} - 1 = 0$
6. $\Rightarrow x = \lambda$
7. Observe, $E(\overline{X}) = \mu = \lambda,$ so $\overline{X}$ is an MVUE of $\lambda$.

Right?

