Question.

Let $\displaystyle Y_1, \ldots, Y_n$ be a random sample from a Gamma distribution $\displaystyle \Gamma(k, \frac{1}{\theta})$ with density function

$\displaystyle f_Y(y)=\frac{\theta^k}{(k-1)!}y^{k-1}e^{-\theta y}$ for $y>0$,

$\displaystyle 0$ otherwise,

where $\displaystyle k\geq1$ is a known integer, and $\displaystyle \theta>0$ is an unknown parameter.

i. Show that the moment generating function of $\displaystyle Y_1+Y_2+ ... +Y_n$ is $\displaystyle M_Y(t)=(1-\frac{t}{\theta})^{-nk}$

ii. Find the constant $c$ such that $\displaystyle c/\bar{Y}$ is an unbiased estimator for $\displaystyle \theta$, where $\displaystyle \bar{X}$ is the sample mean. Calculate the variance of this estimator and compare it with the Cramér-Rao lower bound.

(This is an exact wording from the book, including X bar which appears out of nowhere)

Answer.

(i) $\displaystyle E(e^{t(Y_1+Y_2+\cdots+Y_n)})=\prod_{i=1}^n E(e^{tY_i})=[M_{Y_1}(t)]^n$ since the $Y_i$ are iid.

$\displaystyle =[(\frac{\theta}{\theta-t})^k]^n=(\frac{1}{1-\frac{t}{\theta}})^{nk}=(1-\frac{t}{\theta})^{-nk}$
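To double-check the algebra in (i) numerically, here is a quick stdlib-only sketch (the values of $k$, $\theta$, $t$, $n$ are arbitrary test choices, not from the problem) comparing the integral definition of the MGF against the closed form:

```python
from math import exp, factorial

def gamma_pdf(y, k, theta):
    """Density theta^k / (k-1)! * y^(k-1) * exp(-theta*y) for y > 0."""
    return theta ** k / factorial(k - 1) * y ** (k - 1) * exp(-theta * y)

def mgf_numeric(t, k, theta, upper=60.0, steps=200_000):
    """E[e^{tY}] by trapezoidal integration on [0, upper]; valid for t < theta.

    The integrand decays like exp(-(theta - t) y), so it is negligible
    beyond `upper` for these test values.
    """
    h = upper / steps
    f = lambda y: exp(t * y) * gamma_pdf(y, k, theta)
    total = 0.5 * (f(0.0) + f(upper)) + sum(f(i * h) for i in range(1, steps))
    return total * h

k, theta, t, n = 3, 2.0, 0.5, 4
numeric = mgf_numeric(t, k, theta) ** n        # MGF of the iid sum = product of the n MGFs
closed_form = (1 - t / theta) ** (-n * k)      # the claimed (1 - t/theta)^{-nk}
print(numeric, closed_form)
```

Both numbers should agree to several decimal places, which is reassuring before relying on $(1-\frac{t}{\theta})^{-nk}$ in part (ii).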

(ii) I need to find $\displaystyle E(c/\bar{Y})$, and I don't think I can simply substitute the mean into the denominator, since $\displaystyle E(1/\bar{Y})\neq 1/E(\bar{Y})$ in general, so what else can I do?

I could also try $\displaystyle E\left(\frac{c}{\frac{1}{n}\sum_{i=1}^n Y_i}\right)$. Any pointers?
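Not a full solution, but a direction that may help: part (i) identifies $Y_1+\cdots+Y_n$ as $\Gamma(nk,\frac{1}{\theta})$, so $E(1/\bar{Y})$ reduces to an inverse moment of a single Gamma variable (for $S\sim\Gamma(\alpha,\frac{1}{\theta})$ with $\alpha>1$, the standard result is $E(1/S)=\frac{\theta}{\alpha-1}$). A Monte Carlo sketch (stdlib only; the values of $n$, $k$, $\theta$ are arbitrary test choices) to sanity-check whatever closed form you derive:

```python
import random

def mean_inverse_ybar(n, k, theta, reps=200_000, seed=1):
    """Monte Carlo estimate of E[1 / Ybar] for n iid Gamma(k, 1/theta) draws."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        # gammavariate(alpha, beta) uses scale beta, so beta = 1/theta here
        s = sum(rng.gammavariate(k, 1.0 / theta) for _ in range(n))
        total += n / s  # 1 / Ybar = n / (Y_1 + ... + Y_n)
    return total / reps

n, k, theta = 5, 2, 1.0
est = mean_inverse_ybar(n, k, theta)
# Compare against the inverse-moment formula applied to the Gamma(nk, 1/theta) sum:
analytic = n * theta / (n * k - 1)
print(est, analytic)
```

The two printed values should be close, and matching the simulation against your candidate $c$ is a cheap way to catch an algebra slip before tackling the variance and the Cramér-Rao comparison.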