1. ## Poisson estimator variance problem

Hi all, I need to determine the variance of the estimator for the Poisson mean parameter $\displaystyle \lambda$.

Let $\displaystyle \theta = { 1 \over n} {\sum_{i=1}^{n} } X_i$

From our knowledge of Poisson rvs,
$\displaystyle E[X_i] = \lambda$

$\displaystyle E[{\sum_{i=1}^{n} }X_i] = n{\lambda}$

Similarly,
$\displaystyle E[{X_i}^2] = \lambda + \lambda^2$

$\displaystyle E[ {\sum_{i=1}^{n}} {X_i}^2 ] = n{\lambda} + n{\lambda^2}$

To find the variance, we make use of the above results:

$\displaystyle V[ \theta] = {1 \over n^2} V[{\sum_{i=1}^{n}} {X_i}] = {1 \over n^2} (E[ {\sum_{i=1}^{n}} {X_i}^2] - E[{\sum_{i=1}^{n} }X_i]^2)$
$\displaystyle = { 1 \over n}{\lambda} + {1 \over n}{\lambda^2} - {\lambda^2}$

but it's supposed to equal $\displaystyle {1 \over n} \lambda$, which (together with $\displaystyle E[\theta] = \lambda$) makes $\displaystyle \theta$ an unbiased, consistent estimator. But the above equations don't arrive at this conclusion. Where am I wrong?

2. Hi!
Originally Posted by chopet
Hi all, I need to determine the variance of the estimator for the Poisson mean parameter $\displaystyle \lambda$.

Let $\displaystyle \theta = { 1 \over n} {\sum_{i=1}^{n} } X_i$

From our knowledge of Poisson rvs,
$\displaystyle E[X_i] = \lambda$

$\displaystyle E[{\sum_{i=1}^{n} }X_i] = n{\lambda}$

Similarly,
$\displaystyle E[{X_i}^2] = \lambda + \lambda^2$

$\displaystyle E[ {\sum_{i=1}^{n}} {X_i}^2 ] = n{\lambda} + n{\lambda^2}$

To find the variance, we make use of the above results:

$\displaystyle V[ \theta] = {1 \over n^2} V[{\sum_{i=1}^{n}} {X_i}] = {1 \over n^2} (E[ {\sum_{i=1}^{n}} {X_i}^2] - E[{\sum_{i=1}^{n} }X_i]^2)$
$\displaystyle = { 1 \over n}{\lambda} + {1 \over n}{\lambda^2} - {\lambda^2}$

but it's supposed to equal $\displaystyle {1 \over n} \lambda$, which (together with $\displaystyle E[\theta] = \lambda$) makes $\displaystyle \theta$ an unbiased, consistent estimator. But the above equations don't arrive at this conclusion. Where am I wrong?
I think I've figured out your mistake.

Imagine that $\displaystyle Y=\sum_{i=1}^n X_i$

So $\displaystyle V[\theta]=V \left[\frac 1n Y\right]=\frac 1{n^2} V [Y]=\frac 1{n^2} \left(E[Y^2]-(E[Y])^2\right)$

$\displaystyle \left(E[Y^2]-(E[Y])^2\right)=\left(E \left[{\color{red}\left(\sum_{i=1}^n X_i\right)^2}\right]-\left(E \left[\sum_{i=1}^n X_i\right]\right)^2 \right)$

Do you see how the red part makes all the difference... ?
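If it helps, here's a quick numerical sanity check (just a sketch, assuming NumPy is available) that simulates Poisson draws and compares the two quantities — the square of the sum versus the sum of the squares:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n, trials = 3.0, 5, 200_000

# Each row is one sample of n i.i.d. Poisson(lam) variables.
X = rng.poisson(lam, size=(trials, n))

sq_of_sum = (X.sum(axis=1) ** 2).mean()   # estimates E[(sum X_i)^2]
sum_of_sq = (X ** 2).sum(axis=1).mean()   # estimates E[sum X_i^2]

print(sq_of_sum)  # ≈ n*lam + (n*lam)^2 = 15 + 225 = 240
print(sum_of_sq)  # ≈ n*(lam + lam^2) = 5 * 12 = 60
```

With $\displaystyle \lambda = 3$ and $\displaystyle n = 5$, $\displaystyle E\left[\left(\sum X_i\right)^2\right] = n\lambda + n^2\lambda^2 = 240$ while $\displaystyle E\left[\sum X_i^2\right] = n(\lambda + \lambda^2) = 60$ — the cross terms make the two very different.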

3. Originally Posted by chopet
Hi all, I need to determine the variance of the estimator for the Poisson mean parameter $\displaystyle \lambda$.

Let $\displaystyle \theta = { 1 \over n} {\sum_{i=1}^{n} } X_i$

From our knowledge of Poisson rvs,
$\displaystyle E[X_i] = \lambda$

$\displaystyle E[{\sum_{i=1}^{n} }X_i] = n{\lambda}$

Similarly,
$\displaystyle E[{X_i}^2] = \lambda + \lambda^2$

$\displaystyle E[ {\sum_{i=1}^{n}} {X_i}^2 ] = n{\lambda} + n{\lambda^2}$

To find the variance, we make use of the above results:

$\displaystyle V[ \theta] = {1 \over n^2} V[{\sum_{i=1}^{n}} {X_i}] = {1 \over n^2} (E[ {\sum_{i=1}^{n}} {X_i}^2] - E[{\sum_{i=1}^{n} }X_i]^2)$
$\displaystyle = { 1 \over n}{\lambda} + {1 \over n}{\lambda^2} - {\lambda^2}$

but it's supposed to equal $\displaystyle {1 \over n} \lambda$, which (together with $\displaystyle E[\theta] = \lambda$) makes $\displaystyle \theta$ an unbiased, consistent estimator. But the above equations don't arrive at this conclusion. Where am I wrong?
Why make such heavy weather of things?

If $\displaystyle \theta = { 1 \over n} {\sum_{i=1}^{n} } X_i$ and the $\displaystyle X_i$ are i.i.d. random variables with $\displaystyle Var(X_i) = \sigma^2$ then it's simple to prove that $\displaystyle Var(\theta) = \frac{1}{n^2} \left( n \sigma^2 \right) = \frac{\sigma^2}{n}$.

If $\displaystyle X_i$ is a Poisson random variable then you know that $\displaystyle Var(X_i) = \lambda$.

Therefore $\displaystyle Var(\theta) = \frac{\lambda}{n}$.
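A short Monte Carlo check of this result (a sketch, assuming NumPy), simulating the sample mean of Poisson draws and comparing its variance to $\displaystyle \lambda / n$:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n, trials = 4.0, 10, 200_000

# Each row is one sample of size n; theta is the sample mean.
theta = rng.poisson(lam, size=(trials, n)).mean(axis=1)

print(theta.var())   # ≈ lam / n = 0.4
print(theta.mean())  # ≈ lam = 4.0  (unbiasedness)
```

The empirical variance of $\displaystyle \theta$ comes out very close to $\displaystyle \lambda / n$, and its empirical mean close to $\displaystyle \lambda$, matching both results above.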