MLE, bias for a single observation from N(0,theta)

  1. #1 by liedora (Junior Member, Sydney)

    MLE, bias for a single observation from N(0,theta)

    For a single observation x from N(0,theta), my MLE of theta is x^2 (using log-likelihood derivative)

    I now want to find out if this MLE (x^2) is biased and also want to find out the MLE for the standard deviation.

    Any help would be greatly appreciated.
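
    For reference, a sketch of the log-likelihood derivation referred to above, assuming theta denotes the variance, so the density is f(x; theta) = (2*pi*theta)^(-1/2) * exp(-x^2/(2*theta)):

        \ell(\theta) = \log f(x;\theta) = -\tfrac{1}{2}\log(2\pi\theta) - \frac{x^2}{2\theta},
        \qquad
        \frac{d\ell}{d\theta} = -\frac{1}{2\theta} + \frac{x^2}{2\theta^2} = 0
        \quad\Longrightarrow\quad
        \hat{\theta} = x^2 .

    (If this is right, then by the invariance property of MLEs the MLE of the standard deviation sqrt(theta) would be sqrt(x^2) = |x|.)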

  2. #2 by chiro (MHF Contributor, Australia)

    Re: MLE, bias for a single observation from N(0,theta)

    Hey liedora.

    Remember that an estimator is unbiased if E[theta_hat - theta] = 0, where theta_hat is your estimator for theta (a random variable, so it has a distribution) and theta is the parameter you are estimating (which is a constant).

    Now E[theta] = theta (since theta is a constant), so you need to show that E[theta_hat] = theta for unbiasedness; if this is not the case, you look for a correction theta_hat* = f(theta_hat) such that E[theta_hat*] = theta.
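
    In symbols, the definition being used here is

        \operatorname{Bias}(\hat{\theta}) = E[\hat{\theta} - \theta] = E[\hat{\theta}] - \theta,
        \qquad
        \hat{\theta}\ \text{is unbiased} \iff E[\hat{\theta}] = \theta .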

    So since the MLE you have calculated is x^2, what is E[x^2]?

  3. #3 by liedora (Junior Member, Sydney)

    Re: MLE, bias for a single observation from N(0,theta)

    Hey chiro thanks for your reply.

    I have actually gotten up to finding E[x^2], but I'm not sure how to work this out. I figured that E[x] = mu = 0. Would it then be the case that Var[x] = E[x^2] - E[x]^2 implies that theta = E[x^2] - (0)^2, so E[x^2] = theta? So we conclude that x^2 is an unbiased estimator for theta?
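
    In symbols, that chain of reasoning reads

        E[X^2] = \operatorname{Var}[X] + (E[X])^2 = \theta + 0^2 = \theta,

    so E[theta_hat] = E[x^2] = theta.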

    Not completely certain of my reasoning.

    Thanks in advance for the help

  4. #4 by chiro (MHF Contributor, Australia)

    Re: MLE, bias for a single observation from N(0,theta)

    You need to perform the calculation to confirm it; you can't just assume it.

    For a continuous distribution E[g(X)] = Integral from -infinity to infinity g(x)f(x)dx for a PDF f(x).
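
    For example, with g(x) = x^2 and the N(0, theta) density this is the standard Gaussian second-moment integral

        E[X^2] = \int_{-\infty}^{\infty} x^2\,\frac{1}{\sqrt{2\pi\theta}}\,e^{-x^2/(2\theta)}\,dx = \theta ,

    which can be evaluated with the substitution u = x/sqrt(theta) followed by integration by parts.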

    Also, with regards to the MLE, you need to consider how big your random sample is (i.e. your likelihood function will be a product of the n individual likelihood contributions from your sample, which you will have to account for).

    Remember that the point estimate is a statistic, which is a function of your sample defined by your likelihood. This means that the normal MLE point estimate for the variance (with the mean known to be 0) is the sum of the squared values of your sample divided by n.
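
    As a sketch of that general case (assuming the mean is known to be 0 and theta is the variance), for independent observations x_1, ..., x_n from N(0, theta):

        \ell(\theta) = -\frac{n}{2}\log(2\pi\theta) - \frac{1}{2\theta}\sum_{i=1}^{n} x_i^2,
        \qquad
        \frac{d\ell}{d\theta} = -\frac{n}{2\theta} + \frac{1}{2\theta^2}\sum_{i=1}^{n} x_i^2 = 0
        \quad\Longrightarrow\quad
        \hat{\theta} = \frac{1}{n}\sum_{i=1}^{n} x_i^2 .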

    Now you need to find the expectation of this estimator (remember: it is a function of random variables, so it is itself a random variable). In the usual case where the mean is also estimated from the sample, the resulting variance MLE is biased (but can be corrected to be unbiased); with the mean known to be 0, as here, it turns out to be unbiased.

    I'll wait for you to re-check your calculation to estimate the variance and if you're stuck I can provide a few hints to get you started.

  5. #5 by liedora (Junior Member, Sydney)

    Re: MLE, bias for a single observation from N(0,theta)

    Hi chiro

    Since we know x is an observation from N(0,theta), doesn't that mean the expected value of x would be 0, since the distribution has a population mean of 0? I tried using the integral formula to calculate E(x), but the integration was too hard, and when I tried it in Maple it looked quite messy.

    For the MLE, the question says we only have one observation x. It doesn't say we have x1,x2,...xn iid observations, so that's why my MLE is just x^2.

    What did you get for E(x) and how did you calculate this?

    Thanks again for your help!

  6. #6 by chiro (MHF Contributor, Australia)

    Re: MLE, bias for a single observation from N(0,theta)

    Ohh, I see that you have only one observation now, which makes a lot more sense.

    With regards to E[X], that should be zero (equal to the value of mu).

    With regards to E[X^2], you can use the fact that Var[X] = E[X^2] - E[X]^2 (which you used above, and which should equal theta), so in this very special case the estimator is unbiased. (The usual caveat about the variance MLE being biased applies when the mean is also estimated from the sample.) So here E[theta_hat] = theta, which means it's unbiased.

    The other thing that is important to note is that it doesn't really make sense to talk about variation with only one sample. Usually if you want to look at variation, you look at at least two observations (if you only have one observation, what can you compare it to in order to get variation?).

    In this special case, we use Var[X] = E[X^2], and we know that Var[X] = theta. If you had n independent observations, I guarantee your estimator would be a bit more complex, but for this example it's not.
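
    As a quick numerical sanity check of this conclusion, here is a minimal simulation sketch (the value theta = 2.5, the variable names, and the use of NumPy are illustrative assumptions, not part of the thread):

        import numpy as np

        # Simulate many independent single observations x ~ N(0, theta),
        # where theta is the variance, and average x^2 over the replications.
        rng = np.random.default_rng(0)
        theta = 2.5                                   # assumed true variance for illustration
        x = rng.normal(loc=0.0, scale=np.sqrt(theta), size=1_000_000)
        theta_hat = x ** 2                            # the MLE computed from each single observation
        print(theta_hat.mean())                       # should be close to theta = 2.5

    The average of the simulated x^2 values should land close to theta, consistent with E[x^2] = theta.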

  7. #7 by liedora (Junior Member, Sydney)

    Re: MLE, bias for a single observation from N(0,theta)

    Ok, good to know I'm on track, thanks a lot for your help!!
