
Math Help - Consistent and unbiased estimator.

  1. #1
    Senior Member
    Joined
    Jan 2008
    From
    Montreal
    Posts
    311
    Awards
    1

    Consistent and unbiased estimator.

    Let Y_1, Y_2, \dots, Y_n denote a random sample from the uniform distribution on the interval (\theta,\ \theta+1). Let

    \hat{\theta_1} = \overline{Y}-\frac{1}{2} and \hat{\theta_2} = Y_{(n)} -\frac{n}{n+1}

    a) show that both \hat{\theta_1} and \hat{\theta_2} are unbiased estimators for \theta

    b) show that both \hat{\theta_1} and \hat{\theta_2} are consistent estimators for \theta

    Attempt

    a) I figure f(y) = \left\{ \begin{array}{rcl} \frac{1}{(\theta+1)-\theta} & \mbox{for} & \theta \leq y \leq \theta+1 \\ 0 & \mbox{otherwise} \end{array}\right. simplifying I get:

    f(y) = \left\{ \begin{array}{rcl} 1 & \mbox{for} & \theta \leq y \leq \theta+1 \\ 0 & \mbox{otherwise} \end{array}\right.

    so for the estimator \hat{\theta_1}, I first find E(Y):

    \int_\theta^{\theta+1} y \ dy

    = \frac{y^2}{2}\bigg{|}^{\theta+1}_{\theta}

    = \frac{1}{2}\bigg{(}(\theta+1)^2-\theta^2\bigg{)}

    = \frac{1}{2}(2\theta+1) = \theta +\frac{1}{2}. I know I'm near the end for \hat{\theta_1}, but I don't know how to get it into the above form.

    for \hat{\theta_2}, which is an order statistic, my F(y) = y; the density of the maximum is n[F(y)]^{n-1}f(y), so taking the expectation:

    \int_\theta^{\theta+1} n[y]^{n-1} \cdot 1 \cdot y \ dy

    =\int_\theta^{\theta+1} n[y]^n \ dy

    =y^{n+1}\frac{n}{n+1} \bigg{|}_\theta^{\theta+1}

    =\frac{n}{n+1} \bigg{(}(\theta+1)^{n+1} - \theta^{n+1}\bigg{)} at this point I get stuck.


    b) Looking at the condition \lim_{n\rightarrow \infty} \ V(\hat{\theta}) = 0

    so for \hat{\theta_1}

    E[Y^2]= \int_\theta^{\theta+1} y^2 \ dy

    = \frac{y^3}{3}\bigg{|}^{\theta+1}_{\theta}

    = \frac{1}{3}\bigg{(}(\theta+1)^3-\theta^3\bigg{)}

    =\theta^2+\theta+\frac{1}{3}

    thus V(Y)= \theta^2+\theta+\frac{1}{3} - \left(\theta +\frac{1}{2}\right)^2= \frac{1}{12}

    I'm not sure about this next step, but it looks similar to an example I have in the book: V(\hat{\theta_1})=V \left[ \overline{Y}-\frac{1}{2}\right] = V(\overline{Y})+\frac{1}{4}, at which point I don't know how to proceed.

    for \hat{\theta_2}

    E[Y_{(n)}^2]=\int_\theta^{\theta+1} n[y]^{n-1} \cdot 1 \cdot y^2 \ dy

    =\int_\theta^{\theta+1} n[y]^{n+1} \ dy

    =y^{n+2}\frac{n}{n+2} \bigg{|}_\theta^{\theta+1}

    =\frac{n}{n+2} \bigg{(}(\theta+1)^{n+2} - \theta^{n+2}\bigg{)}

    for the variance

    V[Y_{(n)}]= \frac{n}{n+2} \bigg{(}(\theta+1)^{n+2} - \theta^{n+2}\bigg{)} - \left( \frac{n}{n+1} \bigg{(}(\theta+1)^{n+1} - \theta^{n+1}\bigg{)} \right)^2

    V[Y_{(n)}]= \frac{n}{n+2} \bigg{(}(\theta+1)^{n+2} - \theta^{n+2}\bigg{)} - \left( \frac{n}{n+1} \right)^2 \bigg{(}[(\theta+1)^{n+1}]^2 -2[\theta(\theta+1)]^{n+1} + [\theta^{n+1}]^2\bigg{)} I can't seem to go further from here.

  2. #2
    Flow Master
    mr fantastic's Avatar
    Joined
    Dec 2007
    From
    Zeitgeist
    Posts
    16,948
    Thanks
    5
    Quote Originally Posted by lllll View Post
    Let Y_1, Y_2, \dots, Y_n denote a random sample from the uniform distribution on the interval (\theta,\ \theta+1). Let

    \hat{\theta_1} = \overline{Y}-\frac{1}{2} and \hat{\theta_2} = Y_{(n)} -\frac{n}{n+1}

    a) show that both \hat{\theta_1} and \hat{\theta_2} are unbiased estimators for \theta

    [snip]
    E(\hat{\theta_1}) = E\left(\overline{Y} - \frac{1}{2}\right)

    = E\left(\frac{Y_1 + Y_2 + \, .... \, + Y_n}{n} - \frac{1}{2}\right)

    = E\left(\frac{Y_1}{n}\right) + E\left(\frac{Y_2}{n}\right) + \, .... \, + E\left(\frac{Y_n}{n}\right) - E\left(\frac{1}{2}\right)

    = \frac{\theta + \frac{1}{2}}{n} + \frac{\theta + \frac{1}{2}}{n} + \, .... \, + \frac{\theta + \frac{1}{2}}{n} - \frac{1}{2}

    = n \, \left( \frac{\theta + \frac{1}{2}}{n}\right) - \frac{1}{2} = \theta.

    --------------------------------------------------------------------------------

    E(\hat{\theta_2}) = E\left(Y_{(n)} - \frac{n}{n+1}\right)

    = E\left(Y_{(n)} \right) - E\left(\frac{n}{n+1}\right)

    = E\left(Y_{(n)} \right) - \frac{n}{n+1}.


    g(u) = n [F(u)]^{n-1} f(u) = n (u - \theta)^{n-1} (1) = n (u - \theta)^{n-1}.

    Note: Since the distribution of Y is uniform, no integration is necessary to get F(u).


    Therefore E\left(Y_{(n)} \right) = \int_{\theta}^{\theta + 1} u \, g(u) \, du = n \int_{\theta}^{\theta + 1} u \, (u - \theta)^{n-1} \, du

    Substitute w = u - \theta:

    = n \int_{0}^{1} (w + \theta) \, w^{n-1} \, dw

    = n \int_{0}^{1} w^{n} \, dw + n \theta \int_{0}^{1} w^{n-1} \, dw

    = \frac{n}{n+1} + \theta.
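    As a sanity check, the unbiasedness of both estimators can also be verified by simulation. A quick sketch (the value \theta = 2.5, the sample size n = 50, and the replication count are arbitrary choices):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    theta = 2.5              # true parameter; arbitrary value for the check
    n, reps = 50, 100_000

    # reps independent samples of size n from Uniform(theta, theta + 1)
    y = rng.uniform(theta, theta + 1, size=(reps, n))

    theta1_hat = y.mean(axis=1) - 0.5           # Ybar - 1/2
    theta2_hat = y.max(axis=1) - n / (n + 1)    # Y_(n) - n/(n+1)

    # both averages should be close to theta = 2.5
    print(theta1_hat.mean(), theta2_hat.mean())
    ```

    Both sample means land within simulation noise of \theta, matching E(\hat{\theta_1}) = E(\hat{\theta_2}) = \theta.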
    Last edited by mr fantastic; November 18th 2008 at 03:51 AM. Reason: Replaced \bar with \overline

  3. #3
    Flow Master
    mr fantastic's Avatar
    Joined
    Dec 2007
    From
    Zeitgeist
    Posts
    16,948
    Thanks
    5
    Quote Originally Posted by lllll View Post
    Let Y_1, Y_2, \dots, Y_n denote a random sample from the uniform distribution on the interval (\theta,\ \theta+1). Let

    \hat{\theta_1} = \overline{Y}-\frac{1}{2} and \hat{\theta_2} = Y_{(n)} -\frac{n}{n+1}

    a) show that both \hat{\theta_1} and \hat{\theta_2} are unbiased estimators for \theta

    b) show that both \hat{\theta_1} and \hat{\theta_2} are consistent estimators for \theta

    [snip]

    b) Looking at the condition \lim_{n\rightarrow \infty} \ V(\hat{\theta}) = 0

    so for \hat{\theta_1}

    E[Y^2]= \int_\theta^{\theta+1} y^2 \ dy

    = \frac{y^3}{3}\bigg{|}^{\theta+1}_{\theta}

    = \frac{1}{3}\bigg{(}(\theta+1)^3-\theta^3\bigg{)}

    =\theta^2+\theta+\frac{1}{3}

    thus V(Y)= \theta^2+\theta+\frac{1}{3} - \left(\theta +\frac{1}{2}\right)^2= \frac{1}{12}

    I'm not sure about this next step, but it looks similar to an example I have in the book: V(\hat{\theta_1})=V \left[ \overline{Y}-\frac{1}{2}\right] = V(\overline{Y})+\frac{1}{4} Mr F says: That's not right. It's just {\color{red}V\left(\overline{Y}\right)}.

    at which point I don't know how to proceed.

    for \hat{\theta_2}

    E[Y_{(n)}^2]=\int_\theta^{\theta+1} n[y]^{n-1} \cdot 1 \cdot y^2 \ dy

    =\int_\theta^{\theta+1} n[y]^{n+1} \ dy

    =y^{n+2}\frac{n}{n+2} \bigg{|}_\theta^{\theta+1}

    =\frac{n}{n+2} \bigg{(}(\theta+1)^{n+2} - \theta^{n+2}\bigg{)}

    for the variance

    V[Y_{(n)}]= \frac{n}{n+2} \bigg{(}(\theta+1)^{n+2} - \theta^{n+2}\bigg{)} - \left( \frac{n}{n+1} \bigg{(}(\theta+1)^{n+1} - \theta^{n+1}\bigg{)} \right)^2

    V[Y_{(n)}]= \frac{n}{n+2} \bigg{(}(\theta+1)^{n+2} - \theta^{n+2}\bigg{)} - \left( \frac{n}{n+1} \right)^2 \bigg{(}[(\theta+1)^{n+1}]^2 -2[\theta(\theta+1)]^{n+1} + [\theta^{n+1}]^2\bigg{)} I can't seem to go further from here.
    Since \hat{\theta_1} and \hat{\theta_2} are unbiased estimators, by Chebyshev's inequality it's sufficient to show \lim_{n \rightarrow \infty} Var(\hat{\theta_1}) = 0 and \lim_{n \rightarrow \infty} Var(\hat{\theta_2}) = 0.


    Var(\hat{\theta_1}) = Var\left(\overline{Y} - \frac{1}{2}\right) = Var(\overline{Y}) = Var\left( \frac{Y_1 + Y_2 + \, .... \, + Y_n}{n}\right)

    = \frac{1}{n^2} \, Var\left( Y_1 + Y_2 + \, .... \, + Y_n \right) = \frac{n}{n^2} \, Var(Y_i) = \frac{1}{n} \, \left(\frac{1}{12}\right) = \frac{1}{12n} \rightarrow 0 as n \rightarrow \infty.
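    If it helps, the 1/(12n) rate can be checked numerically. A rough sketch (\theta = 0 and the grid of n values are arbitrary choices):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    theta, reps = 0.0, 10_000    # theta = 0 is an arbitrary choice

    for n in (10, 100, 1000):
        y = rng.uniform(theta, theta + 1, size=(reps, n))
        theta1_hat = y.mean(axis=1) - 0.5
        # empirical variance vs. the theoretical 1/(12n)
        print(n, theta1_hat.var(), 1 / (12 * n))
    ```

    The two columns track each other and shrink like 1/n, which is the consistency of \hat{\theta_1} in action.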

    ----------------------------------------------------------------------------------------------------------------------------

    Var(\hat{\theta_2}) = Var\left(Y_{(n)} - \frac{n}{n+1}\right) = Var\left(Y_{(n)}\right)  = E(Y_{(n)}^2) - [E(Y_{(n)})]^2.

    You've already got E(Y_{(n)}) from part (a).

    E\left(Y_{(n)}^2 \right) = \int_{\theta}^{\theta + 1} u^2 \, g(u) \, du = n \int_{\theta}^{\theta + 1} u^2 \, (u - \theta)^{n-1} \, du = \, ....

    Then Var(\hat{\theta_2}) = \frac{n}{n+2} - \frac{n^2}{(n+1)^2} = \frac{n}{(n+1)^2 (n+2)} \rightarrow 0 as n \rightarrow \infty.
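    As a check, the empirical variance of \hat{\theta_2} can be compared with this expression by simulation. A sketch (\theta = 1 and the n values are arbitrary choices):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    theta, reps = 1.0, 10_000    # theta = 1 is an arbitrary choice

    for n in (10, 100, 1000):
        y = rng.uniform(theta, theta + 1, size=(reps, n))
        theta2_hat = y.max(axis=1) - n / (n + 1)
        # the variance derived above: n/(n+2) - n^2/(n+1)^2
        exact = n / (n + 2) - n ** 2 / (n + 1) ** 2
        print(n, theta2_hat.var(), exact)
    ```

    Note how fast this shrinks: roughly like 1/n^2, much faster than the 1/(12n) variance of \hat{\theta_1}.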
    Last edited by mr fantastic; November 18th 2008 at 03:49 AM.
