
Math Help - [SOLVED] Unbiased estimator

  1. #1 Moo
    [SOLVED] Unbiased estimator

    Hi,

    Just a quick question...

    Suppose we have (X_n), iid variables with joint pdf h(x,\theta) (the product of the individual pdfs), where \theta is an unknown parameter.

    We can write it in exponential form, and it is assumed to be regular.

    Then we've been told that an estimator for \theta would be the \hat{\theta} such that \frac{\partial \log(h(x,\hat{\theta}))}{\partial \theta}=0

    My question is: is \hat{\theta} an unbiased estimator?


    Another question: why do \mathbb{E}\left(\frac{\partial \log h(X,\theta)}{\partial \theta}\right)=0 and the fact that \log(h(x,\theta)) is twice differentiable with respect to \theta, for all x, imply that the likelihood is considered regular?


    Are my notations correct ?



    Thanks in advance,
    Follow Math Help Forum on Facebook and Google+

  2. #2 Moo
    Oh, I saw in a Wikipedia article that we have

    \mathcal{I}(\theta)=\mathbb{E}\left[\left(\frac{\partial \log h(X,\theta)}{\partial \theta}\right)\right]=-\mathbb{E}\left[\frac{\partial^2 \log h(X,\theta)}{\partial\theta^2}\right]

    The fact that \mathbb{E}\left(\frac{\partial \log h(X,\theta)}{\partial \theta}\right)=0 would explain the first equality, but where does the second equality come from ?

    Actually, we've been told that if the likelihood is regular, then we can use the latter formula for \mathcal{I}(\theta)

    I'm pretty lost in all these things... I can apply the formulae, but I have problems understanding the origins...

  3. #3 matheagle
    I'm not exactly sure what you're asking.
    But it looks like you're missing a square on the Fisher information...
    Fisher information - Wikipedia, the free encyclopedia
    The square is inside the expectation.
    Last edited by matheagle; May 25th 2009 at 07:23 PM.

  4. #4 matheagle
    It's....

    Quote Originally Posted by Moo View Post
    Oh, I saw in a Wikipedia article that we have

    \mathcal{I}(\theta)=\mathbb{E}\left[\left(\frac{\partial \log h(X,\theta)}{\partial \theta}\right)^2\right]=-\mathbb{E}\left[\frac{\partial^2 \log h(X,\theta)}{\partial\theta^2}\right]

    The fact that \mathbb{E}\left(\frac{\partial \log h(X,\theta)}{\partial \theta}\right)=0 would explain the first equality, but where does the second equality come from ?

    Actually, we've been told that if the likelihood is regular, then we can use the latter formula for \mathcal{I}(\theta)

    I'm pretty lost in all these things... I can apply the formulae, but I have problems understanding the origins...

  5. #5 cl85
    Hmm, I remember having derived this before. So let me try to make a go at it.
    First, if I remember correctly, you need the following condition (which holds for any support of h):
    \int\frac{\partial^2}{\partial\theta^2} h(x, \theta)\,\mu(dx)=\frac{\partial^2}{\partial\theta^2}\int h(x, \theta)\,\mu(dx)

    Then we note that
    \frac{\partial^2}{\partial\theta^2}\log h(x, \theta) = \frac{\frac{\partial^2}{\partial\theta^2} h(x, \theta)}{h(x, \theta)}-\left(\frac{\partial}{\partial\theta}\log h(x, \theta)\right)^2

    Take expectations on both sides, and we are almost there. Now apply the condition and we have
    \mathbb{E}\left(\frac{\frac{\partial^2}{\partial\theta^2} h(x, \theta)}{h(x, \theta)}\right)=\int_{\Gamma}\frac{\partial^2}{\partial\theta^2} h(x, \theta)\,\mu(dx)=\frac{\partial^2}{\partial\theta^2}\int_{\Gamma} h(x, \theta)\,\mu(dx)=\frac{\partial^2}{\partial\theta^2}(1)=0
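    As a numerical sanity check of these identities (not part of the derivation itself), here is a short Python sketch using a model where everything is known in closed form: if X ~ Exponential(\theta), the score 1/\theta - x has mean zero, and \mathbb{E}[(\text{score})^2]=-\mathbb{E}[\partial^2_\theta \log h]=1/\theta^2. The value theta = 2.0 is an arbitrary choice for the demo.

    ```python
    import random

    # Monte Carlo sanity check, using X ~ Exponential(theta), where
    #   log h(x, theta)           = log(theta) - theta*x
    #   d log h / d theta (score) = 1/theta - x
    #   d^2 log h / d theta^2     = -1/theta**2  (a constant)
    theta = 2.0  # arbitrary demo value
    random.seed(0)
    n = 200_000
    xs = [random.expovariate(theta) for _ in range(n)]

    mean_score = sum(1/theta - x for x in xs) / n
    mean_score_sq = sum((1/theta - x) ** 2 for x in xs) / n
    neg_mean_second = 1/theta**2  # -E[d^2 log h / d theta^2], exact here

    print(mean_score)       # close to 0
    print(mean_score_sq)    # close to 1/theta^2 = 0.25
    print(neg_mean_second)  # exactly 0.25
    ```

    Both Monte Carlo averages agree with 1/\theta^2 up to sampling noise, which is exactly \mathcal{I}(\theta)=\mathbb{E}\left[\left(\partial_\theta\log h\right)^2\right]=-\mathbb{E}\left[\partial^2_\theta\log h\right] in this example.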

  6. #6 Moo
    Quote Originally Posted by matheagle View Post
    I'm not exactly sure what you're asking.
    But it looks like you're missing a square on the Fisher information...
    Fisher information - Wikipedia, the free encyclopedia
    The square is inside the expectation.
    That was just a typo ^^

    Quote Originally Posted by cl85 View Post
    Hmm, I remember having derived this before. So let me try to make a go at it.
    First, if I remember correctly, you need the following condition (which holds for any support of h):
    \int\frac{\partial^2}{\partial\theta^2} h(x, \theta)\,\mu(dx)=\frac{\partial^2}{\partial\theta^2}\int h(x, \theta)\,\mu(dx)

    Then we note that
    \frac{\partial^2}{\partial\theta^2}\log h(x, \theta) = \frac{\frac{\partial^2}{\partial\theta^2} h(x, \theta)}{h(x, \theta)}-\left(\frac{\partial}{\partial\theta}\log h(x, \theta)\right)^2
    Okay, til here, I understand.

    Take expectations on both sides, and we are almost there. Now apply the condition and we have
    \mathbb{E}\left(\frac{\frac{\partial^2}{\partial\theta^2} h(x, \theta)}{h(x, \theta)}\right)=\int_{\Gamma}\frac{\partial^2}{\partial\theta^2} h(x, \theta)\,\mu(dx)=\frac{\partial^2}{\partial\theta^2}\int_{\Gamma} h(x, \theta)\,\mu(dx)=\frac{\partial^2}{\partial\theta^2}(1)=0
    But I don't understand this one...

    Shouldn't there be a log in the integral ?

    And what about \mathbb{E}\left\{\left(\frac{\partial}{\partial\theta}\log h(x, \theta)\right)^2\right\} ?


    Thanks anyway

  7. #7 matheagle
    This is correct...

    \frac{\partial^2}{\partial\theta^2}\log h(x, \theta) = \biggl(\frac{\partial^2}{\partial\theta^2} h(x, \theta)\biggr)/h(x, \theta)-\left(\frac{\partial}{\partial\theta}\log h(x, \theta)\right)^2


    When taking expectations, we are allowed to change the order of the partial derivative and the integral.
    That makes the second term zero and it's over.

    I'm still not sure what you are asking

    {d\over dx} \ln g(x) = {g'(x)\over g(x)}

    So \biggl({d\over dx} \ln g(x)\biggr)^2 = \biggl({g'(x)\over g(x)}\biggr)^2

  8. #8 Moo
    Quote Originally Posted by matheagle View Post
    This is correct...

    \frac{\partial^2}{\partial\theta^2}\log h(x, \theta) = \biggl(\frac{\partial^2}{\partial\theta^2} h(x, \theta)\biggr)/h(x, \theta)-\left(\frac{\partial}{\partial\theta}\log h(x, \theta)\right)^2


    When taking expectations, we are allowed to change the order of the partial derivative and the integral.
    That makes the second term zero and it's over.
    What are you talking about ?
    When are you reversing the derivative & integral sign ? Why would that "make the second term zero" ???

    I'm still not sure what you are asking

    {d\over dx} \ln g(x) = {g'(x)\over g(x)}

    So \biggl({d\over dx} \ln g(x)\biggr)^2 = \biggl({g'(x)\over g(x)}\biggr)^2
    What is it supposed to show ?
    What don't you understand in what I'm asking actually ?

  9. #9 matheagle
    Good night.
    Last edited by matheagle; May 26th 2009 at 02:12 PM.

  10. #10 cl85
    Sorry, I was in a rush earlier. Let me put the brackets properly.

    \frac{\partial^2}{\partial\theta^2}\log h(x, \theta) = \frac{\left(\frac{\partial^2}{\partial\theta^2} h(x, \theta)\right)}{h(x, \theta)}-\left(\frac{\partial}{\partial\theta}\log h(x, \theta)\right)^2

    Rearrange the terms:
    \left(\frac{\partial}{\partial\theta}\log h(x, \theta)\right)^2=\frac{\frac{\partial^2}{\partial\theta^2} h(x, \theta)}{h(x, \theta)}-\frac{\partial^2}{\partial\theta^2}\log h(x, \theta)
    Take expectation on both sides:
    \mathbb{E}\left\{\left(\frac{\partial}{\partial\theta}\log h(x, \theta)\right)^2\right\}=\mathbb{E}\left\{\frac{\frac{\partial^2}{\partial\theta^2} h(x, \theta)}{h(x, \theta)}\right\}-\mathbb{E}\left\{\frac{\partial^2}{\partial\theta^2}\log h(x, \theta)\right\}

    And we almost have the desired equality, except that we need the first term on the RHS to be zero. Since the expectation is an integral against the density h itself, the h in the denominator cancels:
    \mathbb{E}\left\{\frac{\frac{\partial^2}{\partial\theta^2} h(x, \theta)}{h(x, \theta)}\right\}=\int_{\Gamma}\frac{\frac{\partial^2}{\partial\theta^2} h(x, \theta)}{h(x, \theta)}\, h(x, \theta)\,\mu(dx)=\int_{\Gamma}\frac{\partial^2}{\partial\theta^2} h(x, \theta)\,\mu(dx)
    =\frac{\partial^2}{\partial\theta^2}\int_{\Gamma} h(x, \theta)\,\mu(dx)=\frac{\partial^2}{\partial\theta^2}(1)=0
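    To see that this last term really is zero, here is a small Python check (an illustration, not part of the proof), again with the arbitrary choice X ~ Exponential(\theta): differentiating h(x,\theta)=\theta e^{-\theta x} twice in \theta gives \partial^2_\theta h = (\theta x^2 - 2x)e^{-\theta x}, so the ratio (\partial^2_\theta h)/h = (\theta x^2 - 2x)/\theta, and its expectation can be estimated by Monte Carlo.

    ```python
    import random

    # Monte Carlo check that E[(d^2 h/d theta^2) / h] = 0 for
    # X ~ Exponential(theta) with h(x, theta) = theta * exp(-theta*x).
    # Differentiating twice: d^2 h/d theta^2 = (theta*x**2 - 2*x)*exp(-theta*x),
    # so (d^2 h/d theta^2)/h simplifies to (theta*x**2 - 2*x)/theta.
    theta = 2.0  # arbitrary demo value
    random.seed(1)
    n = 200_000
    xs = [random.expovariate(theta) for _ in range(n)]
    mean_ratio = sum((theta * x * x - 2 * x) / theta for x in xs) / n
    print(mean_ratio)  # close to 0
    ```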

  11. #11 Moo
    Thank you

    Oh, there's still one question (the other one is not important) : is \hat\theta unbiased ?

  12. #12 cl85
    Isn't \hat\theta the maximum likelihood estimator (MLE)? MLEs are not always unbiased, but under regularity conditions they are asymptotically unbiased.
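    A classic concrete illustration of this (a sketch with arbitrary demo values, not specific to the exponential-family setup above): for an iid Normal sample, the MLE of the variance divides by n rather than n-1, so its expectation is \frac{n-1}{n}\sigma^2. It is biased for every fixed n, but the bias vanishes as n \to \infty.

    ```python
    import random

    # Illustration that an MLE need not be unbiased: for X_1,...,X_n iid N(0, 1),
    # the MLE of the variance is (1/n) * sum (X_i - Xbar)^2, and its expectation
    # is (n-1)/n, not 1.  Demo values: n = 5, averaged over 100_000 replications.
    random.seed(2)
    n = 5  # small sample size makes the bias visible
    reps = 100_000
    total = 0.0
    for _ in range(reps):
        xs = [random.gauss(0.0, 1.0) for _ in range(n)]
        xbar = sum(xs) / n
        total += sum((x - xbar) ** 2 for x in xs) / n  # variance MLE
    mean_mle = total / reps
    print(mean_mle)  # close to (n-1)/n = 0.8, not 1.0
    ```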
