1. ## Poisson estimator problem

Hi all, I need to determine the variance of the estimator for the Poisson mean parameter $\lambda$.

Let $\theta = { 1 \over n} {\sum_{i=1}^{n} } X_i$

From our knowledge of Poisson random variables,
$E[X_i] = \lambda$

$E[{\sum_{i=1}^{n} }X_i] = n{\lambda}$

Similarly,
$E[{X_i}^2] = \lambda + \lambda^2$

$E[ {\sum_{i=1}^{n}} {X_i}^2 ] = n{\lambda} + n{\lambda^2}$

To find the variance, we make use of the above results:

$V[ \theta] = {1 \over n^2} V[{\sum_{i=1}^{n}} {X_i}] = {1 \over n^2} (E[ {\sum_{i=1}^{n}} {X_i}^2] - E[{\sum_{i=1}^{n} }X_i]^2)$
$= { 1 \over n}{\lambda} + {1 \over n}{\lambda^2} - {\lambda^2}$

But it's supposed to equal ${1 \over n} \lambda$ (and, since $E[\theta] = \lambda$, $\theta$ is an unbiased estimator). The equations above don't arrive at this result. Where am I wrong?

2. Hi!
Originally Posted by chopet
Hi all, I need to determine the variance of the estimator for the Poisson mean parameter $\lambda$ [...] Where am I wrong?
I think I've figured out your mistake.

Imagine that $Y=\sum_{i=1}^n X_i$

So $V[\theta]=V \left[\frac 1n Y\right]=\frac 1{n^2} V [Y]=\frac 1{n^2} \left(E[Y^2]-(E[Y])^2\right)$

$\left(E[Y^2]-(E[Y])^2\right)=\left(E \left[{\color{red}\left(\sum_{i=1}^n X_i\right)^2}\right]-\left(E \left[\sum_{i=1}^n X_i\right]\right)^2 \right)$

Do you see how the red part makes all the difference?
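The distinction can be checked numerically. The sketch below (in Python, using Knuth's multiplication method to draw Poisson variates; the values $\lambda = 3$, $n = 5$ are arbitrary choices for illustration) estimates both $E\left[\sum_i X_i^2\right]$ and $E\left[\left(\sum_i X_i\right)^2\right]$ by simulation. The first comes out near $n\lambda + n\lambda^2$, as in the original post; the second, which is the one the variance formula actually needs, comes out near $n\lambda + n^2\lambda^2$ because the square of the sum also contains the cross terms $\sum_{i \ne j} X_i X_j$.

```python
import math
import random

random.seed(0)

def poisson(lam):
    """Draw one Poisson(lam) variate (Knuth's multiplication method)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

lam, n, trials = 3.0, 5, 100_000   # hypothetical values chosen for illustration

e_sum_sq = 0.0   # running total for E[ sum_i X_i^2 ]
e_sq_sum = 0.0   # running total for E[ (sum_i X_i)^2 ]
for _ in range(trials):
    xs = [poisson(lam) for _ in range(n)]
    s = sum(xs)
    e_sum_sq += sum(x * x for x in xs)
    e_sq_sum += s * s
e_sum_sq /= trials
e_sq_sum /= trials

# Theory: E[sum X_i^2]   = n*lam + n*lam^2   = 60
#         E[(sum X_i)^2] = n*lam + n^2*lam^2 = 240
print(e_sum_sq, e_sq_sum)
```

With these parameters the two quantities differ by a factor of four, so conflating them is not a small error.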

3. Originally Posted by chopet
Hi all, I need to determine the variance of the estimator for the Poisson mean parameter $\lambda$ [...] Where am I wrong?
Why make such heavy weather of things ...?

If $\theta = { 1 \over n} {\sum_{i=1}^{n} } X_i$ and the $X_i$ are i.i.d. random variables with $Var(X_i) = \sigma^2$ then it's simple to prove that $Var(\theta) = \frac{1}{n^2} \left( n \sigma^2 \right) = \frac{\sigma^2}{n}$.

If $X_i$ is a Poisson random variable then you know that $Var(X_i) = \lambda$.

Therefore $Var(\theta) = \frac{\lambda}{n}$.
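As a sanity check, the result is easy to confirm by simulation. The sketch below (Python, reusing Knuth's Poisson sampler; $\lambda = 3$, $n = 5$ are arbitrary illustrative values) draws many samples, computes $\theta$ for each, and compares the empirical mean and variance of $\theta$ against $\lambda$ and $\lambda / n$.

```python
import math
import random

random.seed(1)

def poisson(lam):
    """Draw one Poisson(lam) variate (Knuth's multiplication method)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

lam, n, trials = 3.0, 5, 100_000   # hypothetical values for illustration

thetas = []
for _ in range(trials):
    xs = [poisson(lam) for _ in range(n)]
    thetas.append(sum(xs) / n)     # theta = (1/n) * sum_i X_i

mean = sum(thetas) / trials
var = sum((t - mean) ** 2 for t in thetas) / trials

# Theory: E[theta] = lam = 3.0 (unbiasedness), Var[theta] = lam/n = 0.6
print(mean, var)
```

The empirical mean sitting at $\lambda$ confirms unbiasedness, and the empirical variance sitting at $\lambda / n$ confirms the formula above.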