
Thread: The expected value of this estimator, involving the minimum statistic of an exp dstrn

  1. #1
    Newbie
    Joined
    Aug 2017
    From
    United States
    Posts
    1

    The expected value of this estimator, involving the minimum statistic of an exp dstrn

    Suppose that $Y_1, Y_2, \ldots, Y_n \sim \text{Exp}(\beta)$. Define two estimators for $\beta$ as:

    $B_1 = nY_{(1)}$; $\quad B_2 = \frac{1}{n}\sum_{i=1}^{n} Y_i$, i.e. $\bar{Y}$.

    I'm trying to show that the first estimator is unbiased, but the expected value I keep getting is $n^3\beta$.

    \begin{align*} f_{Y_{(1)}}(y) &= n\left[1 - F_Y(y)\right]^{n-1} f_Y(y) \\ &= n\,(e^{-y/\beta})^{n-1}\,\tfrac{1}{\beta}\,e^{-y/\beta} \\ &= \tfrac{n}{\beta}\,(e^{-y/\beta})^{n} \\ &= \tfrac{n^2}{\beta}\,e^{-y/\beta} \end{align*}

    $E[B_1] = E\!\left[n \cdot \tfrac{n^2}{\beta}\, e^{-y/\beta}\right] = n^3\, E\!\left[\tfrac{1}{\beta}\, e^{-y/\beta}\right] = n^3\beta.$

    Obviously this is incorrect. What am I misunderstanding? I've gone over the minimum order statistic like ten times, and the book we are using just glosses over its expectation.

    Edit: I've noticed at least one mistake: writing $(e^{-y/\beta})^n$ as $n\,e^{-y/\beta}$ is not correct algebra. But I'm still stuck.
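    A quick way to sanity-check what $E[B_1]$ should be is a small simulation. Below is a minimal sketch in Python/NumPy; the choices $\beta = 2$ and $n = 5$ are arbitrary, and the variable names are just for illustration.

    ```python
    # Minimal simulation sketch: compare the empirical means of the two
    # estimators against beta. Here beta is the exponential *mean* (scale),
    # matching the density (1/beta) * exp(-y/beta) above; beta=2, n=5 arbitrary.
    import numpy as np

    rng = np.random.default_rng(0)
    beta, n, reps = 2.0, 5, 100_000

    samples = rng.exponential(scale=beta, size=(reps, n))
    b1 = n * samples.min(axis=1)   # B1 = n * Y_(1)
    b2 = samples.mean(axis=1)      # B2 = Y-bar

    print("beta      :", beta)
    print("mean of B1:", b1.mean())  # an unbiased estimator should average near beta
    print("mean of B2:", b2.mean())
    ```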
    Last edited by notamathmiz; Aug 5th 2017 at 02:08 PM.

  2. #2
    Senior Member
    Joined
    Mar 2008
    From
    Pennsylvania, USA
    Posts
    337
    Thanks
    45

    Re: The expected value of this estimator, involving the minimum statistic of an exp dstrn

    Note that I am assuming that the $Y_i$'s are independent!

    First, find the CDF of $Y_{(1)}$:

    \begin{align*}F_{Y_{(1)}}(y) &= P(Y_{(1)}\leq y) \\ &= 1 - P(Y_{(1)}> y) \\ &= 1-P\left([Y_1>y] \cap [Y_2>y] \cap \ldots \cap [Y_n>y]\right) \\  &= 1-\left[P(Y_1>y)\cdot P(Y_2>y)\cdots P(Y_n>y)\right] \\ &= 1 - \left[(e^{-\tfrac{y}{\beta}}) \cdot (e^{-\tfrac{y}{\beta}}) \cdots (e^{-\tfrac{y}{\beta}})\right] \\ &= 1 - e^{-\left(\tfrac{n}{\beta}\right)y}    \end{align*}
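
    To be explicit about the survival probabilities used above: treating $\beta$ as the mean, matching the density $f_Y(y)=\tfrac{1}{\beta}e^{-y/\beta}$ in your post, each factor is

    \begin{align*} P(Y_i > y) = \int_y^\infty \frac{1}{\beta}\, e^{-t/\beta}\, dt = e^{-y/\beta}. \end{align*}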

    The derivative of the CDF with respect to $y$ gives us the PDF:

    \begin{align*}f_{Y_{(1)}}(y) &= \frac{n}{\beta} \cdot e^{-\left(\tfrac{n}{\beta}\right)y}    \end{align*}
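
    If you already know that an exponential with rate $\lambda$ has mean $1/\lambda$, you can read the answer off directly, since this is exactly an exponential density with rate $n/\beta$:

    \begin{align*} Y_{(1)} \sim \text{Exp}\!\left(\text{rate}=\tfrac{n}{\beta}\right) \quad\Longrightarrow\quad E[Y_{(1)}] = \frac{\beta}{n}. \end{align*}

    The integration by parts below confirms the same mean from the definition.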

    We now compute $E[Y_{(1)}]$ using integration by parts.

    For neatness, we write $\lambda=\tfrac{n}{\beta}$ in the integral. We will use $u=y$, $u^\prime=1$, $v=-e^{-\lambda y}$, $v^\prime=\lambda e^{-\lambda y}$.

    \begin{align*}E[Y_{(1)}] &= \int^\infty_0 y\cdot f_{Y_{(1)}}(y)\, dy \\ &= \int^\infty_0 y \lambda e^{-\lambda y}\, dy \\ &= \left[-ye^{-\lambda y} - \frac{1}{\lambda}e^{-\lambda y}  \right]^{\infty}_{0} \\ &= \frac{1}{\lambda}\\ &= \frac{1}{n/\beta}\\ &= \frac{\beta}{n}  \end{align*}
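
    For an independent check of that integral, here is a minimal SymPy sketch (the symbol names are just illustrative):

    ```python
    # Symbolic check that the density derived above has mean beta/n (sketch).
    import sympy as sp

    y, n, beta = sp.symbols("y n beta", positive=True)
    lam = n / beta                                     # rate of Y_(1)
    pdf = lam * sp.exp(-lam * y)                       # f_{Y_(1)}(y)
    expectation = sp.integrate(y * pdf, (y, 0, sp.oo))
    print(sp.simplify(expectation))                    # -> beta/n
    ```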

    Got it from here?

    Best,
    Andy

