## Unbiased Estimators

Let $Y_1, Y_2, \dots, Y_n$ be a random sample of size $n$ from the pdf $f_Y(y; \theta) = \frac{1}{\theta} e^{-y/\theta}$, $y > 0$:

a) Let $\hat{\theta} = n Y_{\text{min}}$. Is $\hat{\theta}$ unbiased for $\theta$?

For this question, I'm not sure how to derive the pdf of $Y_{\text{min}}$. Once I have that, I assume I multiply by $n$, multiply by that pdf, and evaluate the expectation integral.

b) Is $\hat{\theta} = \frac{1}{n} \sum_{i=1}^{n} Y_i$ unbiased for $\theta$?

2. Originally Posted by eigenvector11
(a) If the random variable $Y$ has pdf $f(y)$, then the pdf of $Y_{(1)} = \min \{ Y_1, \, Y_2, \, \dots, \, Y_n \}$ is found as follows:

The cdf of $Y_{(1)}$ is $G(y) = \Pr(Y_{(1)} \leq y) = 1 - \Pr(Y_{(1)} > y)$.

Since $Y_{(1)}$ is the minimum of $Y_1, \, Y_2, \, \dots, \, Y_n$, the event $\{Y_{(1)} > y\}$ occurs if and only if the events $\{Y_i > y\}$ occur for every $i = 1, 2, \, \dots, \, n$. Since the $Y_i$ are independent and $\Pr(Y_i > y) = 1 - F(y)$, it follows that

$G(y) = \Pr(Y_{(1)} \leq y) = 1 - \Pr(Y_{(1)} > y) = 1 - \Pr(Y_1 > y, \, Y_2 > y, \, \dots, \, Y_n > y)$

$= 1 - \Pr(Y_1 > y) \cdot \Pr(Y_2 > y) \cdots \Pr(Y_n > y) = 1 - [1 - F(y)]^n$.

The pdf of $Y_{(1)}$ is given by $g(y) = \frac{dG}{dy}$: $g(y) = n [1 - F(y)]^{n-1} f(y)$.
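For the exponential pdf in this problem you can make this concrete: the cdf is $F(y) = 1 - e^{-y/\theta}$, so the general formula specializes as follows.

```latex
F(y) = \int_0^y \frac{1}{\theta} e^{-t/\theta} \, dt = 1 - e^{-y/\theta},
\qquad 1 - F(y) = e^{-y/\theta},

g(y) = n \left( e^{-y/\theta} \right)^{n-1} \cdot \frac{1}{\theta} e^{-y/\theta}
     = \frac{n}{\theta} \, e^{-ny/\theta}, \qquad y > 0.
```

Notice that $g(y)$ is itself an exponential pdf, so its mean can be read off by matching it to the form of $f_Y$.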

Now you have to calculate $E(n Y_{(1)})$ and see if it's equal to $\theta$.
--------------------------------------------------------------------------------

(b) Calculate the expected value of the estimator and see whether or not you get $\theta$.
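Both expectations are straightforward integrals, but if you want a sanity check on your answers you can estimate $E(\hat{\theta})$ for each estimator by simulation. This is only a rough numerical check, not a proof, and the function name and parameter values below are my own choices:

```python
import random

def estimate_means(theta=2.0, n=5, reps=200_000, seed=1):
    """Monte Carlo estimates of E(n * Y_min) and E(sample mean)
    for a sample of size n from an exponential pdf with mean theta."""
    rng = random.Random(seed)
    total_a = total_b = 0.0
    for _ in range(reps):
        # expovariate takes the rate 1/theta, so each Y_i has mean theta
        ys = [rng.expovariate(1.0 / theta) for _ in range(n)]
        total_a += n * min(ys)    # estimator (a): n * Y_min
        total_b += sum(ys) / n    # estimator (b): the sample mean
    return total_a / reps, total_b / reps

est_a, est_b = estimate_means()
print(est_a, est_b)  # an unbiased estimator's average should sit near theta
```

If an estimator is unbiased, its simulated average should be close to $\theta = 2.0$ (up to Monte Carlo noise); a systematic gap would signal bias.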