# Thread: Order Stat and unbiased estimator

1. ## Order Stat and unbiased estimator

$\displaystyle f(y) = \left\{ \begin{array}{rcl} \alpha y^{\alpha-1}/\theta^{\alpha} & \mbox{if} & 0 \leq y \leq \theta \\ 0 & \mbox{if} & \mbox{otherwise} \end{array}\right.$

$\displaystyle \alpha > 0$, which is a fixed known value, and $\displaystyle \theta$ is unknown. Consider the estimator $\displaystyle \hat{\theta} =\max(Y_1, Y_2,..., Y_n)$

a) show that $\displaystyle \hat{\theta}$ is a biased estimator for $\displaystyle \theta$.
b) Find a multiple of $\displaystyle \hat{\theta}$ that is an unbiased estimator of $\displaystyle \theta$.

Solutions:

a)$\displaystyle E[Y] = \int_0^{\theta} \frac{\alpha y^{\alpha -1}}{\theta^{\alpha}} \cdot y \ dy$

$\displaystyle =\frac{\alpha}{\theta^{\alpha}} \int_0^{\theta} y^{\alpha} \ dy$

$\displaystyle =\frac{\alpha}{\theta^{\alpha}} \times \frac{y^{\alpha+1}}{\alpha+1} \bigg{|}^{\theta}_0$

$\displaystyle =\frac{\alpha}{\theta^{\alpha}} \times \frac{\theta^{\alpha+1}}{\alpha+1}= \frac{\alpha \cdot \theta}{\alpha+1}$
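As a quick sanity check (mine, not part of the original post), the closed form $\displaystyle E[Y] = \frac{\alpha\theta}{\alpha+1}$ can be verified by simulation. The sketch below draws from $f$ by inverting its CDF, $F(y) = (y/\theta)^{\alpha}$, so $F^{-1}(u) = \theta u^{1/\alpha}$; the parameter values are arbitrary:

```python
import random

# Monte Carlo check of E[Y] = alpha*theta/(alpha + 1) for one draw from
# f(y) = alpha*y^(alpha-1)/theta^alpha on [0, theta].
# Samples are drawn by inversion: F(y) = (y/theta)^alpha gives
# F^{-1}(u) = theta * u**(1/alpha).  Parameter values are arbitrary.
random.seed(0)
alpha, theta = 2.0, 3.0
trials = 200_000
mean = sum(theta * random.random() ** (1 / alpha) for _ in range(trials)) / trials
closed_form = alpha * theta / (alpha + 1)  # = 2.0 for these parameters
print(mean, closed_form)
```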

now for the $\displaystyle \hat{\theta}$

$\displaystyle F(y) = \int_0^{\theta} \frac{\alpha y^{\alpha -1}}{\theta^{\alpha}} \ dy$

$\displaystyle F(y) = \frac{\alpha} {\theta^{\alpha}} \int_0^{\theta} y^{\alpha -1} \ dy$

$\displaystyle F(y) = \frac{\alpha} {\theta^{\alpha}} \cdot \frac{y^{\alpha}}{\alpha}\bigg{|}^{\theta}_{0} =\frac{\alpha} {\theta^{\alpha}} \cdot \frac{\theta^{\alpha}}{\alpha} =1$

$\displaystyle f_{(n)}(y) = n[F(y)]^{n-1}f(y)$

$\displaystyle f_{(n)}(y) = n[1]^{n-1} \times \alpha y^{\alpha-1}/\theta^{\alpha}$

$\displaystyle E[Y] = \int^{\theta}_0 n \times \alpha y^{\alpha-1}/\theta^{\alpha} \times y \ dy$

$\displaystyle = \frac{n\alpha}{\theta^{\alpha}} \int^{\theta}_0 y^{\alpha} \ dy$

$\displaystyle = \frac{n\alpha}{\theta^{\alpha}} \times \frac{y^{\alpha+1}}{\alpha+1} \bigg{|}^{\theta}_0$

$\displaystyle = \frac{n\alpha}{\theta^{\alpha}} \times \frac{\theta^{\alpha+1}}{\alpha+1} = \frac{n\alpha \theta}{\alpha+1}$

now $\displaystyle \frac{n\alpha \theta}{\alpha+1} \neq \frac{\alpha \theta}{\alpha+1}$ therefore it's biased.

b) $\displaystyle n =1$ for it to be an unbiased estimator.

is this correct?

2. Cumulative distribution for y:

$\displaystyle F(y) =\begin{cases} 0& y \le 0 \\ \int_0^{y} \frac{\alpha \xi^{\alpha -1}}{\theta^{\alpha}} \ d \xi & 0<y<\theta \\ 1 & y\ge \theta \end{cases}$
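As an aside (mine, not CB's), evaluating the middle case gives the closed form $\displaystyle F(y) = \left(\frac{y}{\theta}\right)^{\alpha}$ on $(0,\theta)$, which can be checked against an empirical CDF. The Python sketch below uses arbitrary parameter values:

```python
import random

# Compare the closed form F(y) = (y/theta)**alpha against the empirical
# CDF of samples drawn by inversion (y = theta * u**(1/alpha)).
# Parameter values and the test point y are arbitrary.
random.seed(1)
alpha, theta = 2.0, 3.0
samples = [theta * random.random() ** (1 / alpha) for _ in range(100_000)]

y = 1.5  # a point inside (0, theta)
empirical = sum(s <= y for s in samples) / len(samples)
closed_form = (y / theta) ** alpha  # = 0.25 for these values
print(empirical, closed_form)
```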

(You will find it helpful to describe in words what you are trying to do before embarking on the symbolic manipulations.)

CB

3. Originally Posted by lllll
$\displaystyle f(y) = \left\{ \begin{array}{rcl} \alpha y^{\alpha-1}/\theta^{\alpha} & \mbox{if} & 0 \leq y \leq \theta \\ 0 & \mbox{if} & \mbox{otherwise} \end{array}\right.$

$\displaystyle \alpha > 0$, which is a fixed known value, and $\displaystyle \theta$ is unknown. Consider the estimator $\displaystyle \hat{\theta} =\max(Y_1, Y_2,..., Y_n)$

a) show that $\displaystyle \hat{\theta}$ is a biased estimator for $\displaystyle \theta$.
b) Find a multiple of $\displaystyle \hat{\theta}$ that is an unbiased estimator of $\displaystyle \theta$.

Solutions:

a)$\displaystyle E[Y] = \int_0^{\theta} \frac{\alpha y^{\alpha -1}}{\theta^{\alpha}} \cdot y \ dy$

$\displaystyle =\frac{\alpha}{\theta^{\alpha}} \int_0^{\theta} y^{\alpha} \ dy$

$\displaystyle =\frac{\alpha}{\theta^{\alpha}} \times \frac{y^{\alpha+1}}{\alpha+1} \bigg{|}^{\theta}_0$

$\displaystyle =\frac{\alpha}{\theta^{\alpha}} \times \frac{\theta^{\alpha+1}}{\alpha+1}= \frac{\alpha \cdot \theta}{\alpha+1}$

Mr F says: The above calculation is not necessary for answering the question.

now for the $\displaystyle \hat{\theta}$

$\displaystyle F(y) = \int_0^{\theta} \frac{\alpha y^{\alpha -1}}{\theta^{\alpha}} \ dy$ Mr F says: All you will do here is prove that the area under the curve is equal to 1, i.e. f(y) satisfies one of the two criteria for being a pdf ...

$\displaystyle F(y) = \frac{\alpha} {\theta^{\alpha}} \int_0^{\theta} y^{\alpha -1} \ dy$

$\displaystyle F(y) = \frac{\alpha} {\theta^{\alpha}} \cdot \frac{y^{\alpha}}{\alpha}\bigg{|}^{\theta}_{0} =\frac{\alpha} {\theta^{\alpha}} \cdot \frac{\theta^{\alpha}}{\alpha} =1$ Mr F says: An unsurprising result!

$\displaystyle f_{(n)}(y) = n[F(y)]^{n-1}f(y)$

$\displaystyle f_{(n)}(y) = n[1]^{n-1} \times \alpha y^{\alpha-1}/\theta^{\alpha}$

$\displaystyle E[Y] = \int^{\theta}_0 n \times \alpha y^{\alpha-1}/\theta^{\alpha} \times y \ dy$ Mr F says: Should be $\displaystyle {\color{red}E({\color{blue}\hat{\theta}})}$.

$\displaystyle = \frac{n\alpha}{\theta^{\alpha}} \int^{\theta}_0 y^{\alpha} \ dy$

$\displaystyle = \frac{n\alpha}{\theta^{\alpha}} \times \frac{y^{\alpha+1}}{\alpha+1} \bigg{|}^{\theta}_0$

$\displaystyle = \frac{n\alpha}{\theta^{\alpha}} \times \frac{\theta^{\alpha+1}}{\alpha+1} = \frac{n\alpha \theta}{\alpha+1}$

now $\displaystyle \frac{n\alpha \theta}{\alpha+1} \neq \frac{\alpha \theta}{\alpha+1}$ therefore it's biased.

b) $\displaystyle n =1$ for it to be an unbiased estimator.

is this correct?
Key results (details left for you to fill in):

$\displaystyle F(y) = \frac{y^{\alpha}}{\theta^{\alpha}}$.

$\displaystyle f_{(n)}(y) = \frac{n \alpha y^{n \alpha - 1}}{\theta^{n \alpha}}$.

$\displaystyle E(\hat{\theta}) = \int_{0}^{\theta} y \, \frac{n \alpha y^{n \alpha - 1}}{\theta^{n \alpha}} \, dy = \frac{n \alpha}{\theta^{n \alpha}} \, \int_{0}^{\theta} y^{n \alpha} \, dy = \frac{n \alpha}{n \alpha + 1} \, \theta \neq \theta$.

An unbiased estimator is therefore $\displaystyle \left( \frac{n \alpha + 1}{n \alpha} \right)\hat{\theta} = \left( \frac{n \alpha + 1}{n \alpha} \right)\max(Y_1, Y_2,..., Y_n)$.
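These key results can be sanity-checked by simulation. The sketch below (mine, with arbitrary parameter values) draws $n$ observations by inverse-CDF sampling, takes their maximum, compares the sample mean against $\frac{n\alpha}{n\alpha+1}\,\theta$, and then applies the rescaling $\frac{n\alpha+1}{n\alpha}$:

```python
import random

# Monte Carlo check of E[max] = n*alpha*theta/(n*alpha + 1), and of the
# unbiased rescaling (n*alpha + 1)/(n*alpha) * max.
# Samples come from inverse-CDF sampling: F^{-1}(u) = theta * u**(1/alpha).
# Parameter values are arbitrary.
random.seed(2)
alpha, theta, n = 2.0, 3.0, 5
trials = 200_000

def draw_max():
    return max(theta * random.random() ** (1 / alpha) for _ in range(n))

est = sum(draw_max() for _ in range(trials)) / trials
expected = n * alpha * theta / (n * alpha + 1)   # = 30/11 for these parameters
corrected = est * (n * alpha + 1) / (n * alpha)  # should be close to theta
print(est, expected, corrected)
```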

4. Originally Posted by mr fantastic
$\displaystyle E(\hat{\theta}) = \int_{0}^{\theta} y \, \frac{n \alpha y^{n \alpha - 1}}{\theta^{n \alpha}} \, dy = \frac{n \alpha}{\theta^{n \alpha}} \, \int_{0}^{\theta} y^{n \alpha} \, dy = \frac{n \alpha}{n \alpha + 1} \, \theta \neq \theta$.
When I integrate this out I get one. I get the stated answer for $\displaystyle E[\hat{\theta^2}]$

$\displaystyle E[\hat{\theta}] = \int_{0}^{\theta} \, n \left(\frac{y^{\alpha}}{\theta^{\alpha}} \right)^{n-1} \times \frac{\alpha y^{\alpha - 1}}{\theta^{\alpha}} \times y \ dy$

$\displaystyle E[\hat{\theta}] = \int_{0}^{\theta} \alpha n \frac{y^{\alpha n -\alpha}}{\theta^{\alpha n- \alpha}} \times \frac{y^{\alpha}}{\theta^{\alpha}} \ dy$

$\displaystyle E[\hat{\theta}] = \alpha n \int_{0}^{\theta} \frac{y^{\alpha n}}{\theta^{\alpha n}} \ dy$

$\displaystyle E[\hat{\theta}] = \frac{\alpha n}{\theta^{\alpha n}} \int_{0}^{\theta} y^{\alpha n} \ dy$

$\displaystyle E[\hat{\theta}] = \frac{\rlap{\color{red}\vrule height3.5pt depth-2pt width 1.2 em} \alpha n}{\theta^{\alpha n}} \frac{y^{\alpha n}}{\rlap{\color{red}\vrule height3.5pt depth-2pt width 1.2 em}\alpha n} \bigg{|}^{\theta}_0$

$\displaystyle E[\hat{\theta}] = \frac{\theta^{\alpha n}}{\theta^{\alpha n}} - \frac{0^{\alpha n}}{\theta^{\alpha n}}= 1$

now if I do $\displaystyle E[\hat{\theta^2}]$ I get:

$\displaystyle E[\hat{\theta^2}]= \int_{0}^{\theta} \, n \left(\frac{y^{\alpha}}{\theta^{\alpha}} \right)^{n-1} \times \frac{\alpha y^{\alpha - 1}}{\theta^{\alpha}} \times y^2 \ dy$

$\displaystyle E[\hat{\theta^2}] = \int_{0}^{\theta} \alpha n \frac{y^{\alpha n -\alpha}}{\theta^{\alpha n- \alpha}} \times \frac{y^{\alpha+1}}{\theta^{\alpha}} \ dy$

$\displaystyle E[\hat{\theta^2}] = \alpha n \int_{0}^{\theta} \frac{y^{\alpha n+1}}{\theta^{\alpha n}} \ dy$

$\displaystyle E[\hat{\theta^2}] = \frac{\alpha n}{\theta^{\alpha n}} \int_{0}^{\theta} y^{\alpha n+1} \ dy$

$\displaystyle E[\hat{\theta^2}] = \frac{\alpha n}{\theta^{\alpha n}} \frac{y^{\alpha n+1}}{n+1} \bigg{|}^{\theta}_0$

$\displaystyle E[\hat{\theta^2}] = \frac{\alpha n}{\theta^{\alpha n}} \frac{\theta^{\alpha n+1}}{n+1}-\frac{\alpha n}{\theta^{\alpha n}} \frac{0^{\alpha n+1}}{n+1} = \frac{\alpha n }{n+1} \theta$

now, for an unbiased estimator we need $\displaystyle E[\hat{\theta}] = \theta$, but I got $\displaystyle E[\hat{\theta}] = 1$. Since my $\displaystyle E[\hat{\theta}]$ is simply a constant in this case, would it have to be equal to $\displaystyle \theta$?

5. Originally Posted by lllll
When I integrate this out I get one. I get the stated answer for $\displaystyle E[\hat{\theta^2}]$

$\displaystyle E[\hat{\theta}] = \int_{0}^{\theta} \, n \left(\frac{y^{\alpha}}{\theta^{\alpha}} \right)^{n-1} \times \frac{\alpha y^{\alpha - 1}}{\theta^{\alpha}} \times y \ dy$

$\displaystyle E[\hat{\theta}] = \int_{0}^{\theta} \alpha n \frac{y^{\alpha n -\alpha}}{\theta^{\alpha n- \alpha}} \times \frac{y^{\alpha}}{\theta^{\alpha}} \ dy$

$\displaystyle E[\hat{\theta}] = \alpha n \int_{0}^{\theta} \frac{y^{\alpha n}}{\theta^{\alpha n}} \ dy$

$\displaystyle E[\hat{\theta}] = \frac{\alpha n}{\theta^{\alpha n}} \int_{0}^{\theta} y^{\alpha n} \ dy$

$\displaystyle E[\hat{\theta}] = \frac{\rlap{\color{red}\vrule height3.5pt depth-2pt width 1.2 em} \alpha n}{\theta^{\alpha n}} \frac{y^{\alpha n}}{\rlap{\color{red}\vrule height3.5pt depth-2pt width 1.2 em}\alpha n} \bigg{|}^{\theta}_0$ Mr F says: This is wrong. It should be $\displaystyle {\color{red} \frac{\alpha n}{\theta^{\alpha n}} \left[ \frac{y^{\alpha n{\color{blue} + 1}}}{\alpha n{\color{blue} + 1}}\right]_{0}^{\theta}}$.

[snip]

6. Originally Posted by lllll
[snip]
now if I do $\displaystyle E[\hat{\theta^2}]$ I get:

$\displaystyle E[\hat{\theta^2}]= \int_{0}^{\theta} \, n \left(\frac{y^{\alpha}}{\theta^{\alpha}} \right)^{n-1} \times \frac{\alpha y^{\alpha - 1}}{\theta^{\alpha}} \times y^2 \ dy$

$\displaystyle E[\hat{\theta^2}] = \int_{0}^{\theta} \alpha n \frac{y^{\alpha n -\alpha}}{\theta^{\alpha n- \alpha}} \times \frac{y^{\alpha+1}}{\theta^{\alpha}} \ dy$

$\displaystyle E[\hat{\theta^2}] = \alpha n \int_{0}^{\theta} \frac{y^{\alpha n+1}}{\theta^{\alpha n}} \ dy$

$\displaystyle E[\hat{\theta^2}] = \frac{\alpha n}{\theta^{\alpha n}} \int_{0}^{\theta} y^{\alpha n+1} \ dy$

$\displaystyle E[\hat{\theta^2}] = \frac{\alpha n}{\theta^{\alpha n}} \frac{y^{\alpha n+1}}{n+1} \bigg{|}^{\theta}_0$ Mr F says: This is wrong. It should be $\displaystyle {\color{red}\frac{\alpha n}{\theta^{\alpha n}} ~ \frac{y^{\alpha n + {\color{blue}2}}}{{\color{blue}\alpha} n + {\color{blue}2}} \bigg{|}^{\theta}_{0}}$

[snip]
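Completing Mr F's corrected integral gives $\displaystyle E[\hat{\theta}^2] = \frac{n\alpha}{\theta^{n\alpha}} \cdot \frac{\theta^{n\alpha+2}}{n\alpha+2} = \frac{n\alpha \, \theta^2}{n\alpha+2}$. A brief Monte Carlo sketch (mine, with arbitrary parameter values) agrees:

```python
import random

# Monte Carlo check of E[max^2] = n*alpha*theta^2/(n*alpha + 2) for the
# maximum of n draws from f(y) = alpha*y^(alpha-1)/theta^alpha.
# Inverse-CDF sampling: F^{-1}(u) = theta * u**(1/alpha).
# Parameter values are arbitrary.
random.seed(3)
alpha, theta, n = 2.0, 3.0, 5
trials = 200_000

def draw_max():
    return max(theta * random.random() ** (1 / alpha) for _ in range(n))

second_moment = sum(draw_max() ** 2 for _ in range(trials)) / trials
closed_form = n * alpha * theta ** 2 / (n * alpha + 2)  # = 7.5 for these parameters
print(second_moment, closed_form)
```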