- May 5th 2009, 07:19 PM, survivor1980: Find the best unbiased estimator of 1/b of gamma dist.
Hi guys. I have a question and I hope someone can help me out.

Let X1, ..., Xn be a random sample from Gamma(a, b) with a known.

Find the best unbiased estimator of 1/b.

Waiting for your response as soon as you can.

Thanks in advance.

- May 5th 2009, 08:34 PM, mr fantastic
Read this: Best Unbiased Estimators

- May 5th 2009, 09:45 PMmatheagle
1. I need to know how you're writing your gamma density. Sometimes the b is in the numerator, sometimes it's in the denominator.

2. Is best = minimum variance? Hence UMVUE?

- May 5th 2009, 10:31 PM, survivor1980
Thank you so much, guys, for the speedy response.

Let X₁, X₂, X₃, …, Xn be a random sample from Gamma(α, β) with α known. Find the best unbiased estimator of 1/β.

Note: 1. α = alpha, β = beta.

2. Yes, the best unbiased estimator is the same as the UMVUE.

thanks

- May 6th 2009, 05:20 PM, matheagle
I cannot help you until I see how you write your density.

I said that yesterday, and I still don't know what '1-α = alpha, Β=beta' means.

- May 6th 2009, 05:32 PM, survivor1980
Hi all

This is the function

http://upload.wikimedia.org/math/6/4...191f5af62f.png

- May 6th 2009, 05:34 PM, matheagle
OK, I'll work with that, BUT that is not how most people write the gamma density. I place beta in the denominator.

The likelihood function is L(beta) = beta^(n*alpha) * (prod xi)^(alpha - 1) * e^(-beta*Sum(xi)) / Gamma(alpha)^n.

Thus with alpha known we have Sum(Xi) sufficient for beta.

And since E(Xbar/alpha) = 1/beta, we have Xbar/alpha as the UMVUE of 1/beta.
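With the density written this way (beta in the numerator, i.e. the rate form, so E(X) = alpha/beta), Xbar/alpha is unbiased for 1/beta and is a function of the sufficient statistic Sum(Xi). A minimal simulation sketch of that claim; the parameter values here are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 3.0, 2.0        # shape and rate (beta in the numerator of the density)
n, reps = 50, 200_000         # sample size and number of simulated samples

# numpy parameterizes the gamma by scale, so scale = 1/beta gives the rate form
x = rng.gamma(shape=alpha, scale=1.0 / beta, size=(reps, n))

est = x.mean(axis=1) / alpha  # Xbar / alpha for each simulated sample
print(est.mean(), 1.0 / beta) # the two numbers should agree closely
```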

I could have done this yesterday, but the mean of my gamma's is alpha*beta,

and getting an unbiased estimator of 1/beta is a lot harder in that case.

And I wasn't going to do this until I knew how you were writing your density.

- February 28th 2010, 10:22 AM, ricer
- April 12th 2010, 08:06 PMpalabine
It can be shown that Sum(Xi) is not only sufficient but complete for beta (the natural parameter w(beta) of the exponential family ranges over a set containing an open interval), so try 1/sum(xi) to estimate 1/beta. Lehmann-Scheffé tells us that an unbiased estimator that is a function of a complete sufficient statistic is the best unbiased estimator.

It can be shown that, assuming iid xi, Sum(xi) is distributed as Gamma(n*alpha,beta).

Given the above, let Y=(1/sum(xi)). Y is distributed as an inverted gamma(n*alpha, 1/beta) with mean=(1/beta)/(n*alpha-1). Thus, E(1/sum(xi)) = E(Y) = (1/beta)/(n*alpha-1), which is obviously a biased estimator of 1/beta.
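The expectation above can be checked numerically; a minimal sketch assuming the scale parameterization used in this post (which is also numpy's), with arbitrary illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 2.0, 3.0   # shape and scale
n, reps = 40, 200_000    # sample size and number of simulated samples

x = rng.gamma(shape=alpha, scale=beta, size=(reps, n))
s = x.sum(axis=1)        # Sum(xi) ~ Gamma(n*alpha, beta)

# E(1/Sum(xi)) matches (1/beta)/(n*alpha - 1), not 1/beta, confirming the bias
print((1.0 / s).mean(), (1.0 / beta) / (n * alpha - 1))
```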

Now let T = (n*alpha - 1)/sum(xi) be the unbiased estimator of 1/beta that is a function of a complete statistic. Thus, T is the best unbiased estimator of 1/beta, but it can be shown that it does not attain the Cramér-Rao lower bound.

- April 12th 2010, 08:32 PM, ricer
Thanks (Clapping)

- April 12th 2010, 09:42 PM, matheagle
- May 3rd 2011, 03:45 AM, mnazam
- May 3rd 2011, 04:19 AM, Moo
- May 3rd 2011, 05:15 AM, mnazam
I see. Thanks for the info. If the pdf can be written like this, it'd be easy: we can show that it is an exponential family, so Sum(Xi) is a complete sufficient statistic for beta. Then we can use the MLE to get 1/beta-hat. Then, to show that 1/beta-hat is an unbiased estimator, we calculate its mean and get 1/beta, so 1/beta-hat is unbiased.

So we can conclude that 1/beta-hat is the best unbiased estimator. Is that right?
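This claim can be sanity-checked by simulation. In the rate form, maximizing the likelihood gives beta-hat = n*alpha/Sum(xi), and by invariance the MLE of 1/beta is Sum(xi)/(n*alpha) = Xbar/alpha, whose expectation is exactly 1/beta. Note that 1/beta-hat is unbiased even though beta-hat itself is not. A sketch with arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, beta = 2.5, 4.0            # shape and rate
n, reps = 30, 200_000             # sample size and number of simulated samples

x = rng.gamma(shape=alpha, scale=1.0 / beta, size=(reps, n))
s = x.sum(axis=1)

beta_mle = n * alpha / s          # MLE of beta
inv_beta_mle = s / (n * alpha)    # MLE of 1/beta, by invariance of the MLE

print(beta_mle.mean(), beta)          # biased: mean is n*alpha*beta/(n*alpha - 1)
print(inv_beta_mle.mean(), 1 / beta)  # unbiased: the two agree closely
```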