Hello,
For the mgf, you got it wrong.
$M_Y(t)=E\left[e^{tY}\right]=\int_0^\infty e^{ty}\,\frac{y^{\alpha-1}e^{-y/\theta}}{\Gamma(\alpha)\,\theta^{\alpha}}\,dy=(1-\theta t)^{-\alpha}$ for $t<1/\theta$, so there's no extra factor (you just got messed up in the algebra).
Question.
Let $Y_1,\dots,Y_n$ be a random sample from a Gamma distribution with density function
$f(y;\theta)=\dfrac{y^{\alpha-1}e^{-y/\theta}}{\Gamma(\alpha)\,\theta^{\alpha}}$ for $y>0$,
where $\alpha$ is a known integer, and $\theta$ is an unknown parameter.
i. Show that the moment generating function of $\bar{Y}$ is $M_{\bar{Y}}(t)=\left(1-\dfrac{\theta t}{n}\right)^{-n\alpha}$.
ii. Find the constant $c$ such that $c/\bar{Y}$ (that's "c over Y bar", if it does not show up properly on the screen) is an unbiased estimator for $1/\theta$, where $\bar{X}$ is the sample mean. Calculate the variance of this estimator and compare it with the Cramér-Rao lower bound.
(This is the exact wording from the book, including $\bar{X}$, which appears out of nowhere.)
Answer.
(i) $M_{\bar{Y}}(t)=E\left[e^{t\bar{Y}}\right]=\left[M_{Y}(t/n)\right]^{n}=\left(1-\dfrac{\theta t}{n}\right)^{-n\alpha}$, since the $Y$s are iid.
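As a quick numerical sanity check (not part of the proof), a Monte Carlo estimate of $E\left[e^{t\bar{Y}}\right]$ can be compared against $(1-\theta t/n)^{-n\alpha}$. The parameter values below are arbitrary illustrative choices, not from the problem; note the formula requires $t<n/\theta$:

```python
import math
import random

# Arbitrary illustrative parameters (alpha is the known shape, theta the scale)
alpha, theta, n, t = 2, 1.0, 5, 0.3
random.seed(0)

reps = 100_000
# Empirical E[exp(t * Ybar)]: Ybar is the mean of n Gamma(alpha, theta) draws.
# random.gammavariate(shape, scale) matches the density y^(a-1) e^(-y/theta) / (Gamma(a) theta^a).
empirical = sum(
    math.exp(t * sum(random.gammavariate(alpha, theta) for _ in range(n)) / n)
    for _ in range(reps)
) / reps

theoretical = (1 - theta * t / n) ** (-n * alpha)  # valid since t < n/theta
```

With these values the two numbers agree to well under a percent.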
(ii) I need to find $E\left[1/\bar{Y}\right]$ (the expected value of $1/\bar{Y}$), and I don't think I can simply substitute the mean in the denominator, so what else can I do? Any pointers?
For the second question, all I can see is that you notice, from the mgf, that the sum $\sum_{i=1}^{n}Y_i$ follows a Gamma distribution $\Gamma(n\alpha,\theta)$, hence you have its density ($\tilde{f}(y)=\dfrac{y^{n\alpha-1}e^{-y/\theta}}{\Gamma(n\alpha)\,\theta^{n\alpha}}$).
So $E\left[\dfrac{1}{\bar{Y}}\right]=n\,E\left[\dfrac{1}{\sum_i Y_i}\right]=n\int_0^\infty \dfrac{1}{y}\,\tilde{f}(y)\,dy$.
I didn't do the calculations, but this integral can be solved by retrieving the pdf of another Gamma (whose integral is 1).
Thanks - let me try.
Can I ask what you use the tilde for here? as in $\tilde{f}(y)$. To show that it is a distribution function?
Now then,
let $U=\sum_{i=1}^{n}Y_i$ (to distinguish it from the $Y$ variable);
then per (a), $M_U(t)=\left[M_Y(t)\right]^{n}=(1-\theta t)^{-n\alpha}$, and we conclude that $U\sim\Gamma(n\alpha,\theta)$, as Moo has pointed out and I should have seen myself ))) (probably it is a good idea to use part (a) in working out part (b)),
and I will use the distribution of $U$ to find the expected value of $1/U$:
$E\left[\dfrac{1}{U}\right]=\int_0^\infty \dfrac{1}{u}\cdot\dfrac{u^{n\alpha-1}e^{-u/\theta}}{\Gamma(n\alpha)\,\theta^{n\alpha}}\,du=\dfrac{1}{(n\alpha-1)\,\theta}\int_0^\infty \dfrac{u^{n\alpha-2}e^{-u/\theta}}{\Gamma(n\alpha-1)\,\theta^{n\alpha-1}}\,du=\dfrac{1}{(n\alpha-1)\,\theta}$,
since the integral is a Gamma distribution integral for $\Gamma(n\alpha-1,\theta)$ and is equal to 1.
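The closed form $E[1/U]=1/((n\alpha-1)\theta)$ can also be checked by direct numerical quadrature of the integral (a sketch with arbitrary illustrative parameter values, using Simpson's rule on a truncated range outside which the integrand is negligible):

```python
import math

# Arbitrary illustrative values; U has Gamma shape k = n*alpha and scale theta
alpha, theta, n = 3, 2.0, 4
k = n * alpha

norm = math.gamma(k) * theta ** k        # normalising constant of the Gamma(k, theta) pdf

def integrand(u):
    # (1/u) times the Gamma(k, theta) density: u^(k-2) e^(-u/theta) / norm
    return u ** (k - 2) * math.exp(-u / theta) / norm

# Composite Simpson's rule on [a, b]; m must be even
a, b, m = 1e-9, 200.0, 200_000
h = (b - a) / m
total = integrand(a) + integrand(b)
for i in range(1, m):
    total += (4 if i % 2 else 2) * integrand(a + i * h)
numeric = total * h / 3

closed_form = 1 / ((k - 1) * theta)      # the value derived above: 1/((n*alpha - 1) * theta)
```

With these values both come out to $1/22$.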
Therefore $E\left[\dfrac{1}{\bar{Y}}\right]=E\left[\dfrac{n}{U}\right]=\dfrac{n}{(n\alpha-1)\,\theta}$,
and unbiasedness requires $E\left[\dfrac{c}{\bar{Y}}\right]=\dfrac{cn}{(n\alpha-1)\,\theta}=\dfrac{1}{\theta}$,
so $c=\dfrac{n\alpha-1}{n}$, and the unbiased estimator is $\dfrac{c}{\bar{Y}}=\dfrac{n\alpha-1}{n\bar{Y}}$.
I now need to find the variance of that unbiased estimator:
$\operatorname{Var}\left(\dfrac{c}{\bar{Y}}\right)=c^{2}\,E\left[\dfrac{1}{\bar{Y}^{2}}\right]-\left(E\left[\dfrac{c}{\bar{Y}}\right]\right)^{2}=c^{2}\,E\left[\dfrac{1}{\bar{Y}^{2}}\right]-\dfrac{1}{\theta^{2}}$.
Here is how I came up with $E\left[\dfrac{1}{\bar{Y}^{2}}\right]$: by the same Gamma-integral trick, $E\left[\dfrac{1}{U^{2}}\right]=\int_0^\infty \dfrac{1}{u^{2}}\cdot\dfrac{u^{n\alpha-1}e^{-u/\theta}}{\Gamma(n\alpha)\,\theta^{n\alpha}}\,du=\dfrac{1}{(n\alpha-1)(n\alpha-2)\,\theta^{2}}$, so $E\left[\dfrac{1}{\bar{Y}^{2}}\right]=n^{2}\,E\left[\dfrac{1}{U^{2}}\right]=\dfrac{n^{2}}{(n\alpha-1)(n\alpha-2)\,\theta^{2}}$.
Then finally the variance of the unbiased estimator:
$\operatorname{Var}\left(\dfrac{c}{\bar{Y}}\right)=\left(\dfrac{n\alpha-1}{n}\right)^{2}\dfrac{n^{2}}{(n\alpha-1)(n\alpha-2)\,\theta^{2}}-\dfrac{1}{\theta^{2}}=\dfrac{n\alpha-1}{(n\alpha-2)\,\theta^{2}}-\dfrac{1}{\theta^{2}}=\dfrac{1}{(n\alpha-2)\,\theta^{2}}$.
Not sure if it is OK to have $\theta$ in the formula?
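Both the unbiasedness of $c/\bar{Y}$ for $1/\theta$ and the variance formula $1/((n\alpha-2)\theta^{2})$ can be sanity-checked with a quick simulation. The parameter values below are arbitrary illustrative choices (with $n\alpha>2$ so the variance exists):

```python
import random

alpha, theta, n = 3, 2.0, 4              # arbitrary test values with n*alpha > 2
c = (n * alpha - 1) / n                  # the constant derived above
random.seed(1)

reps = 200_000
# Each replication: draw a fresh sample of size n and form the estimator c / Ybar
estimates = [
    c / (sum(random.gammavariate(alpha, theta) for _ in range(n)) / n)
    for _ in range(reps)
]

mean_est = sum(estimates) / reps                            # should be near 1/theta = 0.5
var_est = sum((e - mean_est) ** 2 for e in estimates) / (reps - 1)
predicted_var = 1 / ((n * alpha - 2) * theta ** 2)          # = 0.025 here
```

The simulated mean and variance land within Monte Carlo error of $1/\theta$ and the closed form.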
Next, I must figure out the Cramér-Rao lower bound for the variance...
The tilde was there because I name the rv $Y$. So when I saw that there could be a confusion with the very first pdf you defined, I chose to call the pdf differently ($\tilde{f}$) instead of calling the rv differently.
Your calculations for finding $c$ are correct.
It's ok for the variance too, and it's not a problem if there is a $\theta$ in it. You could have cancelled the $n^{2}$ from the beginning... It's too bad you kept it 'til the end.
For the Cramér-Rao bound, you have to compute the Fisher information, right? So just try it ^^ It's not that difficult.
FYI: $I(\theta)=E\left[\left(\dfrac{\partial}{\partial\theta}\ln f(Y;\theta)\right)^{2}\right]=-E\left[\dfrac{\partial^{2}}{\partial\theta^{2}}\ln f(Y;\theta)\right]$.
I am sorry, I was just building up to it - I am terribly insecure when it comes to Cramér-Rao.
Log-likelihood for one observation: $\ln f(y;\theta)=(\alpha-1)\ln y-\dfrac{y}{\theta}-\ln\Gamma(\alpha)-\alpha\ln\theta$.
Information: $\dfrac{\partial}{\partial\theta}\ln f(y;\theta)=\dfrac{y}{\theta^{2}}-\dfrac{\alpha}{\theta}$, so
$I_1(\theta)=-E\left[\dfrac{\partial^{2}}{\partial\theta^{2}}\ln f(Y;\theta)\right]=-E\left[\dfrac{\alpha}{\theta^{2}}-\dfrac{2Y}{\theta^{3}}\right]=\dfrac{2\alpha\theta}{\theta^{3}}-\dfrac{\alpha}{\theta^{2}}=\dfrac{\alpha}{\theta^{2}}$, for a single observation.
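As a numeric cross-check (arbitrary illustrative values, not part of the derivation): the Fisher information per observation equals the mean squared score $E\left[(Y/\theta^{2}-\alpha/\theta)^{2}\right]$, which a simulation should reproduce as $\alpha/\theta^{2}$:

```python
import random

alpha, theta = 3, 2.0                  # arbitrary test values
random.seed(3)

reps = 200_000
# Score for one observation: d/dtheta ln f(Y; theta) = Y/theta^2 - alpha/theta
mean_sq_score = sum(
    (random.gammavariate(alpha, theta) / theta ** 2 - alpha / theta) ** 2
    for _ in range(reps)
) / reps

fisher_info = alpha / theta ** 2       # = 0.75 here
```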
Then for a random sample of $n$ iid observations, $I_n(\theta)=n\,I_1(\theta)=\dfrac{n\alpha}{\theta^{2}}$, and the Cramér-Rao lower bound for unbiased estimators of $g(\theta)=1/\theta$ is
$\dfrac{[g'(\theta)]^{2}}{I_n(\theta)}=\dfrac{1/\theta^{4}}{n\alpha/\theta^{2}}=\dfrac{1}{n\alpha\,\theta^{2}}<\dfrac{1}{(n\alpha-2)\,\theta^{2}}=\operatorname{Var}\left(\dfrac{c}{\bar{Y}}\right)$, as expected.
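To make the comparison concrete, here are the two closed forms evaluated at arbitrary illustrative values. The estimator's variance strictly exceeds the bound, and the ratio $n\alpha/(n\alpha-2)$ tends to 1 as $n$ grows, so the estimator is still asymptotically efficient:

```python
alpha, theta, n = 3, 2.0, 4                          # arbitrary values with n*alpha > 2

estimator_var = 1 / ((n * alpha - 2) * theta ** 2)   # variance of c/Ybar found above
crlb = 1 / (n * alpha * theta ** 2)                  # Cramer-Rao bound for estimating 1/theta

ratio = estimator_var / crlb                         # equals n*alpha / (n*alpha - 2)
```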
I think I am done here. By the way, how do you place a 'hat' on top of a variable in LaTeX code?