
How to prove consistency of the biased MLE estimator for sigma^2

  1. #1 pimponi (Newbie)
    How to prove consistency of the biased MLE estimator for sigma^2

    Hi guys,
    I have a small math problem.
    I have a sample of N i.i.d. normally distributed variables. I take the MLEs for mu and sigma^2 (but the biased estimator for sigma^2, the one with the 1/N factor). Now I have to prove consistency of both. Proving it for the unbiased estimator of mu is easy using Chebyshev.
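    Written out for concreteness (this is just the standard normal-sample MLE setup described above), the estimators are

     \hat\mu = \bar X = \frac{1}{N}\sum_{i=1}^{N} X_i, \qquad \hat\sigma^2 = \frac{1}{N}\sum_{i=1}^{N}\left(X_i - \bar X\right)^2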
    My problem is proving the consistency of the biased MLE estimator of sigma^2.
    Can anybody help me?
    Cheers

  2. #2 Anonymous1 (Super Member)
    Quote Originally Posted by pimponi View Post
    My problem is proving the consistency of the biased MLE estimator of sigma^2.
    What exactly do you mean by consistency?

  3. #3 pimponi (Newbie)
    Quote Originally Posted by Anonymous1 View Post
    What exactly do you mean by consistency?
    That the ML estimate of the parameter converges in probability to the true value of the parameter.
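    In symbols, what I need to show is

     \hat\theta_N \xrightarrow{P} \theta, \qquad \text{i.e. for every } \varepsilon > 0: \quad \lim_{N\to\infty} P\left(\left|\hat\theta_N - \theta\right| > \varepsilon\right) = 0

    (with \theta = \sigma^2 and \hat\theta_N the biased MLE above).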

  4. #4 matheagle (MHF Contributor)
    Chebyshev is not the way to go; you would need a fourth moment in that case.
    You can obtain STRONG consistency by the Strong Law of Large Numbers.

     \frac{\sum (X_i-\bar X)^2}{n} = \frac{\sum X_i^2}{n} - (\bar X)^2 \to E(X^2) - \mu^2 = \sigma^2 \quad \text{almost surely}

    Almost sure convergence implies convergence in probability, but what we get here is actually strong consistency.

    AND I used \bar X \to \mu a.s., which doesn't need Chebyshev either.
    I could prove these with Chebyshev, but you would need a fourth moment.
    And we don't need normality here either.
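
    (To spell out the fourth-moment remark, here is a sketch of what the Chebyshev route would need: applying Chebyshev to the sample mean of the X_i^2 gives

     P\left(\left|\frac{1}{n}\sum_{i=1}^{n} X_i^2 - E(X^2)\right| > \varepsilon\right) \le \frac{\mathrm{Var}(X_1^2)}{n\varepsilon^2} = \frac{E(X^4) - \left(E(X^2)\right)^2}{n\varepsilon^2},

    so a finite fourth moment is exactly what makes that bound vanish as n \to \infty.)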

  5. #5 pimponi (Newbie)
    Quote Originally Posted by matheagle View Post
    You can obtain STRONG consistency by the Strong Law of Large Numbers.
    Thanks a lot.

