I need your help. Please read the document that I attached here. Thanks!
To save others the trouble of opening the attachment:
1. Let $\displaystyle x_1, \, x_2, \, \ldots , \, x_n$ be a sample from a Bernoulli distribution with parameter $\displaystyle p$.
$\displaystyle P[X = x] = p^x (1 - p)^{1-x} \, I_{\{0, 1\}}(x)$
a. Derive the method of moments estimator of p.
b. Verify if your method of moments estimator of p is unbiased for p.
2. Let $\displaystyle x_1, \, x_2, \, \ldots , \, x_n$ be a sample from a Gamma distribution with parameters $\displaystyle \alpha$ and $\displaystyle \beta$.
$\displaystyle f(x) = \frac{x^{\alpha - 1} e^{-x/\beta}}{\Gamma(\alpha) \beta^{\alpha}}, \, x > 0, \, \alpha > 0, \, \beta > 0$
$\displaystyle f(x) = 0, ~ x \leq 0$
a. If $\displaystyle \beta$ is known, derive the method of moments estimator of $\displaystyle \alpha$.
b. Verify if your method of moments estimator of $\displaystyle \alpha$ is unbiased for $\displaystyle \alpha$.
First read these threads:
http://www.mathhelpforum.com/math-he...tion-help.html (posts #1, #2)
http://www.mathhelpforum.com/math-he...tatistics.html (posts #1, #2)
http://www.mathhelpforum.com/math-he...estimator.html
http://www.mathhelpforum.com/math-he...estimator.html
http://www.mathhelpforum.com/math-he...estimator.html
1. a. $\displaystyle E(X) = p$.
Sample mean $\displaystyle = \frac{x_1 + x_2 + \cdots + x_n}{n}$.

Equating $\displaystyle E(X)$ with the sample mean gives the estimator $\displaystyle \hat{p} = \frac{x_1 + x_2 + \cdots + x_n}{n}$.
1. b. Show whether or not $\displaystyle E(\hat{p}) = p$.
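A quick numerical check of 1a/1b (a sketch only; the values of $p$, the sample size, and the number of replications are arbitrary choices, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 50, 20000

# Draw `reps` Bernoulli(p) samples of size n and compute the
# method of moments estimate p_hat = sample mean for each one.
samples = rng.binomial(1, p, size=(reps, n))
p_hat = samples.mean(axis=1)

# If the estimator is unbiased, the average of p_hat over many
# replications should be close to the true p.
print(p_hat.mean())
```

The average of $\displaystyle \hat{p}$ over the replications lands very close to $p$, which is consistent with what part b asks you to prove algebraically.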
-----------------------------------------------------------------------------------------
2. a. $\displaystyle E(X) = \alpha \, \beta \Rightarrow \alpha = \frac{E(X)}{\beta}$.
Sample mean $\displaystyle = \frac{x_1 + x_2 + \cdots + x_n}{n}$.

Equating $\displaystyle E(X)$ with the sample mean gives the estimator $\displaystyle \hat{\alpha} = \frac{x_1 + x_2 + \cdots + x_n}{n \, \beta}$.
2. b. Show whether or not $\displaystyle E(\hat{\alpha}) = \alpha$.
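The same kind of numerical check works for 2a/2b (again a sketch; $\alpha$, $\beta$, the sample size, and the replication count are arbitrary). Note that NumPy's `scale` parameter matches the $\beta$ in the density $\displaystyle f(x) = \frac{x^{\alpha - 1} e^{-x/\beta}}{\Gamma(\alpha) \beta^{\alpha}}$:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta, n, reps = 2.0, 3.0, 50, 20000

# Draw `reps` Gamma(alpha, beta) samples of size n.
samples = rng.gamma(shape=alpha, scale=beta, size=(reps, n))

# Method of moments estimate: alpha_hat = sample mean / beta
# (beta is assumed known, as in part 2a).
alpha_hat = samples.mean(axis=1) / beta

# If the estimator is unbiased, the average of alpha_hat over
# many replications should be close to the true alpha.
print(alpha_hat.mean())
```

Since $\displaystyle E(\bar{X}) = \alpha \beta$, the simulated average of $\displaystyle \hat{\alpha}$ sits close to $\alpha$, matching what part b asks you to verify.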