# Thread: Unbiased estimator of the exponential distribution

1. ## Unbiased estimator of the exponential distribution

Let $f(x; \theta) = \theta e^{- \theta x}$ and $Y = \sum_{i=1}^n X_i$. What function of $Y$ is an unbiased estimator of $\theta$?

So I know the mean of the exponential distribution is $\frac{1}{\theta}$,

$E(Y) = E\left(\sum X_i\right) = \frac{n}{\theta}$

therefore $E(nY^{-1}) = \theta$.

Is this correct? Something about it is bothering me, or am I just being paranoid?

2. Well, you should be a bit paranoid.

If $E(W)=p$ then it does not follow that $E(W^{-1})=1/p$

So, calculate $E(Y^{-1})$ and see what you get.
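To see the bias concretely, here is a quick Monte Carlo sketch (assuming Python; the values $\theta = 2$, $n = 5$ are arbitrary choices for illustration):

```python
import random

random.seed(0)
theta, n, trials = 2.0, 5, 200_000

# Naive estimator n / Y, where Y is the sum of n exponential(theta) draws.
# random.expovariate takes the rate parameter, i.e. mean 1/theta per draw.
total = 0.0
for _ in range(trials):
    y = sum(random.expovariate(theta) for _ in range(n))
    total += n / y
est = total / trials
print(est)  # noticeably above theta = 2.0; theory gives n*theta/(n-1) = 2.5
```

The overshoot is Jensen's inequality at work: $1/y$ is convex, so $E[1/Y] > 1/E[Y]$, and $n/Y$ systematically overestimates $\theta$.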

3. Am I allowed to use $\frac{\theta^2 Y}{n}$?

4. Nope.
You cannot estimate an unknown parameter with itself.
Otherwise $\theta$ would always be the unbiased estimator of $\theta$.

5. So what shall I do?

6. Hello,

Do as you've been told above: find $E[Y^{-1}]$.

In order to do that, recall that $E[h(Y)] = \int h(y)\, g(y)\, dy$, where $g$ is the pdf of $Y$.
Also note that $Y$ is the sum of $n$ independent random variables, each following an exponential distribution with parameter $\theta$.
So its pdf is that of a gamma distribution with shape $n$ and scale $1/\theta$ (see the Wikipedia article on the exponential distribution).
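The gamma claim can be checked empirically. A minimal sketch (assuming Python; $\theta = 2$, $n = 5$ chosen arbitrarily) compares the sample mean and variance of $Y$ against the gamma values $n/\theta$ and $n/\theta^2$:

```python
import random
import statistics

random.seed(1)
theta, n, trials = 2.0, 5, 100_000

# Each Y is a sum of n independent exponential(theta) draws.
ys = [sum(random.expovariate(theta) for _ in range(n)) for _ in range(trials)]
m = statistics.mean(ys)      # should be close to n/theta = 2.5
v = statistics.variance(ys)  # should be close to n/theta**2 = 1.25
print(m, v)
```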

7. Now I get it. Thank you both for your help.

8. Just to make sure I didn't make any more mistakes:

$\int^{\infty}_0 y^{-1}\, \frac{\theta^n}{\Gamma (n)}\, y^{n-1} e^{-\theta y}\, dy = \frac{\theta\, \Gamma(n-1)}{\Gamma (n)}$ (after a change of variables $t = \theta y$)

So $E\left(\frac{\Gamma (n)}{\Gamma (n-1)}\, Y^{-1}\right) = \theta$
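As a numerical sanity check of the integral for a concrete case (a sketch assuming Python; $\theta = 2$, $n = 5$ are arbitrary), a trapezoidal approximation of $E[Y^{-1}]$ should land near $\theta\,\Gamma(n-1)/\Gamma(n) = \theta/(n-1) = 0.5$:

```python
import math

theta, n = 2.0, 5

def pdf(y):
    # pdf of Y ~ Gamma(shape n, rate theta); Gamma(n) = (n-1)! for integer n
    return theta**n / math.factorial(n - 1) * y**(n - 1) * math.exp(-theta * y)

# Trapezoidal approximation of E[1/Y] = integral of y^{-1} * pdf(y) dy.
# The tail beyond b = 30 is negligible (e^{-60} scale), and the integrand
# behaves like y^(n-2) near zero, so it vanishes there for n >= 2.
a, b, steps = 1e-9, 30.0, 200_000
h = (b - a) / steps
vals = [pdf(a + i * h) / (a + i * h) for i in range(steps + 1)]
E_inv = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))
print(E_inv)  # close to theta / (n - 1) = 0.5
```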

9. Yeah, looks correct.
But note that $\Gamma(n)=(n-1)!$ and $\Gamma(n-1)=(n-2)!$

So you're left with $E\left[(n-1)Y^{-1}\right]=\theta$
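The final estimator also checks out by simulation (a sketch, assuming Python; same arbitrary $\theta = 2$, $n = 5$ as above):

```python
import random

random.seed(42)
theta, n, trials = 2.0, 5, 200_000

# Unbiased estimator (n - 1) / Y, where Y is the sum of n exponential(theta) draws.
total = 0.0
for _ in range(trials):
    y = sum(random.expovariate(theta) for _ in range(n))
    total += (n - 1) / y
est = total / trials
print(est)  # close to theta = 2.0
```

Note that $(n-1)Y^{-1}$ only makes sense for $n \ge 2$: for $n = 1$, $E[Y^{-1}]$ diverges, so no unbiased estimator of this form exists.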

10. Originally Posted by Moo
Hello,

Do as you've been told above: find $E[Y^{-1}]$.

I'm still in shock over this.
What a difference from a year ago.

