
Thread: Consistent and unbiased estimator.

  1. #1
    Senior Member
    Joined
    Jan 2008
    From
    Montreal
    Posts
    311
    Awards
    1

    Consistent and unbiased estimator.

Let $\displaystyle Y_1, Y_2, \dots, Y_n$ denote a random sample from the uniform distribution on the interval $\displaystyle (\theta,\ \theta+1)$. Let

    $\displaystyle \hat{\theta_1} = \overline{Y}-\frac{1}{2}$ and $\displaystyle \hat{\theta_2} = Y_{(n)} -\frac{n}{n+1}$

    a) show that both $\displaystyle \hat{\theta_1}$ and $\displaystyle \hat{\theta_2}$ are unbiased estimators for $\displaystyle \theta$

    b) show that both $\displaystyle \hat{\theta_1}$ and $\displaystyle \hat{\theta_2}$ are consistent estimators for $\displaystyle \theta$

    Attempt

    a) I figure $\displaystyle f(y) = \left\{ \begin{array}{rcl}
    \frac{1}{(\theta+1)-\theta} & \mbox{for} & \theta \leq y \leq \theta+1 \\ 0 & \mbox{otherwise} &
    \end{array}\right. $ simplifying I get:

    $\displaystyle f(y) = \left\{ \begin{array}{rcl}
    1 & \mbox{for} & \theta \leq y \leq \theta+1 \\ 0 & \mbox{otherwise} &
    \end{array}\right. $

    so for the estimator $\displaystyle \hat{\theta_1}$ I need $\displaystyle E[Y]$:

    $\displaystyle \int_\theta^{\theta+1} y \ dy$

    $\displaystyle = \frac{y^2}{2}\bigg{|}^{\theta+1}_{\theta}$

    $\displaystyle = \frac{1}{2}\bigg{(}(\theta+1)^2-\theta^2\bigg{)}$

    $\displaystyle = \frac{1}{2}(2\theta+1) = \theta +\frac{1}{2}$ I know I'm near the end for $\displaystyle \hat{\theta_1}$ but I don't know how to get it into the above form.

    for $\displaystyle \hat{\theta_2}$, which involves an order statistic, my $\displaystyle F(y) = y$, so the density of the maximum is $\displaystyle n[F(y)]^{n-1}f(y)$; computing the expectation:

    $\displaystyle \int_\theta^{\theta+1} n[y]^{n-1} \cdot 1 \cdot y \ dy$

    $\displaystyle =\int_\theta^{\theta+1} n[y]^n \ dy$

    $\displaystyle =y^{n+1}\frac{n}{n+1} \bigg{|}_\theta^{\theta+1}$

    $\displaystyle =\frac{n}{n+1} \bigg{(}(\theta+1)^{n+1} - \theta^{n+1}\bigg{)}$ at this point I get stuck.


    b) Using the fact that an unbiased estimator is consistent if $\displaystyle \lim_{n\rightarrow \infty} \ V(\hat{\theta}) = 0 $

    so for $\displaystyle \hat{\theta_1}$

    $\displaystyle E[Y^2]= \int_\theta^{\theta+1} y^2 \ dy$

    $\displaystyle = \frac{y^3}{3}\bigg{|}^{\theta+1}_{\theta}$

    $\displaystyle = \frac{1}{3}\bigg{(}(\theta+1)^3-\theta^3\bigg{)}$

    $\displaystyle =\theta^2+\theta+\frac{1}{3}$

    thus $\displaystyle V(Y)= \theta^2+\theta+\frac{1}{3} - \left(\theta +\frac{1}{2}\right)^2= \frac{1}{12}$

    this next step I'm not sure about, but it looks similar to the example I have in the book: $\displaystyle V(\hat{\theta_1})=V \left[ \overline{Y}-\frac{1}{2}\right] = V(\overline{Y})+\frac{1}{4}$, at which point I don't know how to proceed.

    for $\displaystyle \hat{\theta_2}$

    $\displaystyle E[Y_{(n)}^2]=\int_\theta^{\theta+1} n[y]^{n-1} \cdot 1 \cdot y^2 \ dy$

    $\displaystyle =\int_\theta^{\theta+1} n[y]^{n+1} \ dy$

    $\displaystyle =\frac{n}{n+2}\,y^{n+2} \bigg{|}_\theta^{\theta+1}$

    $\displaystyle =\frac{n}{n+2} \bigg{(}(\theta+1)^{n+2} - \theta^{n+2}\bigg{)}$

    for the variance

    $\displaystyle V[Y_{(n)}]= \frac{n}{n+2} \bigg{(}(\theta+1)^{n+2} - \theta^{n+2}\bigg{)} - \left( \frac{n}{n+1} \bigg{(}(\theta+1)^{n+1} - \theta^{n+1}\bigg{)} \right)^2$

    $\displaystyle V[Y_{(n)}]= \frac{n}{n+2} \bigg{(}(\theta+1)^{n+2} - \theta^{n+2}\bigg{)} - \left( \frac{n}{n+1} \right)^2 \bigg{(}[(\theta+1)^{n+1}]^2 -2[\theta(\theta+1)]^{n+1} + [\theta^{n+1}]^2\bigg{)} $ I can't seem to go further from here.

  2. #2
    Flow Master
    mr fantastic's Avatar
    Joined
    Dec 2007
    From
    Zeitgeist
    Posts
    16,948
    Thanks
    9
    Quote Originally Posted by lllll View Post
    [snip]
    $\displaystyle E(\hat{\theta_1}) = E\left(\overline{Y} - \frac{1}{2}\right)$

    $\displaystyle = E\left(\frac{Y_1 + Y_2 + \dots + Y_n}{n} - \frac{1}{2}\right)$

    $\displaystyle = E\left(\frac{Y_1}{n}\right) + E\left(\frac{Y_2}{n}\right) + \dots + E\left(\frac{Y_n}{n}\right) - E\left(\frac{1}{2}\right)$

    $\displaystyle = \frac{\theta + \frac{1}{2}}{n} + \frac{\theta + \frac{1}{2}}{n} + \dots + \frac{\theta + \frac{1}{2}}{n} - \frac{1}{2}$

    $\displaystyle = n \, \left( \frac{\theta + \frac{1}{2}}{n}\right) - \frac{1}{2} = \theta$.
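    Not part of the proof, but the result is easy to sanity-check with a short Monte Carlo simulation; the values $\displaystyle \theta = 3$, $\displaystyle n = 50$, and 20000 replications below are arbitrary choices:

    ```python
    # Monte Carlo sanity check (not a proof): the average of
    # theta_hat_1 = Ybar - 1/2 over many samples should be close to theta.
    # theta = 3.0, n = 50, reps = 20000 are arbitrary choices.
    import random

    random.seed(42)
    theta, n, reps = 3.0, 50, 20000

    estimates = []
    for _ in range(reps):
        ys = [random.uniform(theta, theta + 1) for _ in range(n)]
        estimates.append(sum(ys) / n - 0.5)  # theta_hat_1 = Ybar - 1/2

    mean_est = sum(estimates) / reps
    print(mean_est)  # close to theta = 3.0
    ```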

    --------------------------------------------------------------------------------

    $\displaystyle E(\hat{\theta_2}) = E\left(Y_{(n)} - \frac{n}{n+1}\right)$

    $\displaystyle = E\left(Y_{(n)} \right) - E\left(\frac{n}{n+1}\right)$

    $\displaystyle = E\left(Y_{(n)} \right) - \frac{n}{n+1}$.


    $\displaystyle g(u) = n [F(u)]^{n-1} f(u) = n (u - \theta)^{n-1} (1) = n (u - \theta)^{n-1}$.

    Note: Since the distribution of Y is uniform, $\displaystyle F(u) = \frac{u - \theta}{(\theta + 1) - \theta} = u - \theta$ can be written down directly; no integration is necessary.


    Therefore $\displaystyle E\left(Y_{(n)} \right) = \int_{\theta}^{\theta + 1} u \, g(u) \, du = n \int_{\theta}^{\theta + 1} u \, (u - \theta)^{n-1} \, du$

    Substitute $\displaystyle w = u - \theta$:

    $\displaystyle = n \int_{0}^{1} (w + \theta) \, w^{n-1} \, dw$

    $\displaystyle = n \int_{0}^{1} w^{n} \, dw + n \theta \int_{0}^{1} w^{n-1} \, dw$

    $\displaystyle = \frac{n}{n+1} + \theta$.

    Therefore $\displaystyle E(\hat{\theta_2}) = \frac{n}{n+1} + \theta - \frac{n}{n+1} = \theta$.
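    The closed form $\displaystyle E\left(Y_{(n)}\right) = \theta + \frac{n}{n+1}$ can also be checked numerically; $\displaystyle \theta = 3$ and the simulation sizes below are arbitrary choices:

    ```python
    # Simulate the sample maximum and compare its average with
    # theta + n/(n+1).  theta, n, reps are arbitrary choices.
    import random

    random.seed(0)
    theta, n, reps = 3.0, 50, 20000

    mean_max = sum(
        max(random.uniform(theta, theta + 1) for _ in range(n))
        for _ in range(reps)
    ) / reps

    expected = theta + n / (n + 1)
    print(mean_max, expected)  # the two values should be close
    ```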
    Last edited by mr fantastic; Nov 18th 2008 at 03:51 AM. Reason: Replaced \bar with \overline

  3. #3
    Flow Master
    mr fantastic's Avatar
    Joined
    Dec 2007
    From
    Zeitgeist
    Posts
    16,948
    Thanks
    9
    Quote Originally Posted by lllll View Post
    [snip]

    this next step I'm not sure about, but it looks similar to the example I have in the book: $\displaystyle V(\hat{\theta_1})=V \left[ \overline{Y}-\frac{1}{2}\right] = V(\overline{Y})+\frac{1}{4}$ Mr F says: That's not right. It's just $\displaystyle {\color{red}V\left(\overline{Y}\right)}$.

    at which point I don't know how to proceed.

    [snip]
    Since $\displaystyle \hat{\theta_1}$ and $\displaystyle \hat{\theta_2}$ are unbiased estimators, it's sufficient to show $\displaystyle \lim_{n \rightarrow \infty} Var(\hat{\theta_1}) = 0$ and $\displaystyle \lim_{n \rightarrow \infty} Var(\hat{\theta_2}) = 0$.


    $\displaystyle Var(\hat{\theta_1}) = Var\left(\overline{Y} - \frac{1}{2}\right) = Var(\overline{Y}) = Var\left( \frac{Y_1 + Y_2 + \dots + Y_n}{n}\right)$

    $\displaystyle = \frac{1}{n^2} \, Var\left( Y_1 + Y_2 + \dots + Y_n \right) = \frac{n}{n^2} \, Var(Y_i) = \frac{1}{n} \, \left(\frac{1}{12}\right) \rightarrow 0$ as $\displaystyle n \rightarrow \infty$, so $\displaystyle \hat{\theta_1}$ is consistent.
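    The formula $\displaystyle Var(\hat{\theta_1}) = \frac{1}{12n}$ can be checked by simulation; $\displaystyle \theta = 3$ and the replication count are arbitrary choices:

    ```python
    # Empirical variance of theta_hat_1 for two sample sizes, compared
    # with the formula 1/(12n).  Parameters are arbitrary choices.
    import random

    random.seed(1)
    theta, reps = 3.0, 20000

    def var_theta_hat_1(n):
        vals = []
        for _ in range(reps):
            ys = [random.uniform(theta, theta + 1) for _ in range(n)]
            vals.append(sum(ys) / n - 0.5)  # theta_hat_1 = Ybar - 1/2
        m = sum(vals) / reps
        return sum((v - m) ** 2 for v in vals) / reps

    v10, v100 = var_theta_hat_1(10), var_theta_hat_1(100)
    print(v10, 1 / 120)    # empirical vs 1/(12*10)
    print(v100, 1 / 1200)  # empirical vs 1/(12*100); shrinks with n
    ```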

    ----------------------------------------------------------------------------------------------------------------------------

    $\displaystyle Var(\hat{\theta_2}) = Var\left(Y_{(n)} - \frac{n}{n+1}\right) = Var\left(Y_{(n)}\right)$ $\displaystyle = E(Y_{(n)}^2) - [E(Y_{(n)})]^2$.

    You've already got $\displaystyle E(Y_{(n)})$ from part (a).

    $\displaystyle E\left(Y_{(n)}^2 \right) = \int_{\theta}^{\theta + 1} u^2 \, g(u) \, du = n \int_{\theta}^{\theta + 1} u^2 \, (u - \theta)^{n-1} \, du = \, ....$

    Then $\displaystyle Var(\hat{\theta_2}) = \frac{n}{n+2} - \frac{n^2}{(n+1)^2} = \frac{n}{(n+1)^2 (n+2)} \rightarrow 0$ as $\displaystyle n \rightarrow \infty$, so $\displaystyle \hat{\theta_2}$ is consistent.
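    This variance formula can likewise be checked by simulation; $\displaystyle \theta = 3$ and the simulation sizes below are arbitrary choices:

    ```python
    # Empirical variance of theta_hat_2 = Y_(n) - n/(n+1), compared with
    # the formula n/((n+1)^2 (n+2)).  Parameters are arbitrary choices.
    import random

    random.seed(2)
    theta, reps = 3.0, 20000

    def var_theta_hat_2(n):
        vals = []
        for _ in range(reps):
            m = max(random.uniform(theta, theta + 1) for _ in range(n))
            vals.append(m - n / (n + 1))  # theta_hat_2
        mean = sum(vals) / reps
        return sum((v - mean) ** 2 for v in vals) / reps

    v10, v100 = var_theta_hat_2(10), var_theta_hat_2(100)
    print(v10, 10 / (11**2 * 12))       # empirical vs formula at n = 10
    print(v100, 100 / (101**2 * 102))   # shrinks as n grows
    ```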
    Last edited by mr fantastic; Nov 18th 2008 at 03:49 AM.

