
Math Help - Estimators...

  1. #1
    Moo
    Estimators...

    Hello,

    Okay, I don't know at all if I did these questions correctly...

    Let f(x;\theta)=\begin{cases} 1-\theta & \text{ if } -1\leq x\leq 0 \\ 2\theta (1-x) & \text{ if } 0<x\leq 1 \\ 0 & \text{ if } x>1 \end{cases}

    and where 0<\theta<1

    Let X be a rv with pdf f.
    We know that \mathbb{E}(X)=\tfrac 56 \cdot\theta-\tfrac 12
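    (As a quick check, this comes from \mathbb{E}(X)=\int_{-1}^0 x(1-\theta)\,dx+\int_0^1 2\theta\, x(1-x)\,dx=-\tfrac{1-\theta}{2}+\tfrac{\theta}{3}=\tfrac 56\cdot\theta-\tfrac 12.)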

    Let (X_1,\dots,X_n) be a sample of iid rvs following the same distribution as X.

    Let the rv Z_i=\begin{cases} 1 &\text{ if } X_i>0 \\ 0 &\text{ otherwise}\end{cases} and let Y=\sum_{i=1}^n Z_i


    Preliminary questions: we proved that Y/n is an unbiased and convergent estimator for \theta.
    We also proved that Y follows a binomial distribution with parameters n and \theta.
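    (Sketch of why: each Z_i is Bernoulli with success probability P(X_i>0)=\int_0^1 2\theta(1-x)\,dx=\theta, so Y=\sum_{i=1}^n Z_i\sim\text{Bin}(n,\theta), \mathbb{E}(Y/n)=\theta, and Y/n\to\theta in probability by the law of large numbers.)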


    First question: calculate an estimator T for \theta by the method of moments.
    --------------------
    So for this one... we know that \theta=\left(\mathbb{E}(X)+\tfrac 12\right) \cdot \tfrac 65

    So we can take T=\left(\tfrac Yn+\tfrac 12\right) \cdot \tfrac 65, right?

    I thought of using the observations of the Z_i, but the given table contains values of the X_i, not of the Z_i, so I suspected this was too easy...

    --------------------


    Second question: prove that T converges in probability to \theta and calculate its bias.
    --------------------
    To prove the convergence, I used the LLN (the weak law, the one stating convergence in probability) and then Slutsky's theorem.

    Then for its bias, I find 0... is that normal?? This is where I am the most doubtful...

    --------------------


    My own questions:
    - is the estimator by the method of moments unique?
    - is it always an unbiased estimator?

    It may sound stupid, but I don't know how to be sure...



    Thanks in advance !

  2. #2
    matheagle
    Do you mean a consistent estimator for \theta?

    And f(x)=0 for x<-1 too.

    Your mean of X is correct.
    But I would think that the MOM estimator of theta would be the solution to

    {5\over 6}\hat\theta-{1\over 2}={\sum_{i=1}^n X_i\over n}=\bar X

    You're using the Z's when you use that Y; those are the indicators of the X's, not the X's themselves.
    I'm just setting the population mean of the X's equal to its sample mean.


    Since

    \bar X \buildrel P\over\to {5\over 6}\theta -{1\over 2}

    \hat\theta={6\over 5}\biggl(\bar X +{1\over 2}\biggr)\buildrel P\over\to \theta
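
    For the bias part: since \mathbb{E}(\bar X)={5\over 6}\theta-{1\over 2}, this estimator satisfies \mathbb{E}(\hat\theta)={6\over 5}\left(\mathbb{E}(\bar X)+{1\over 2}\right)={6\over 5}\cdot{5\over 6}\theta=\theta, so its bias is indeed 0.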

  3. #3
    matheagle
    If you want to use the Y in the MOM estimation, you should set

    \hat{E}(Z_i)={\sum_{i=1}^n Z_i\over n}={Y\over n}

    Now E(Z_i)=P\{X_i>0\}=2\theta \int_0^1 (1-x)\,dx=\theta

    In that case you get \hat\theta={Y\over n}
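
    If it helps, here is a small Monte Carlo sketch of the two estimators discussed above (Python with numpy; the value of \theta, the sample size and the number of replications are illustrative choices, not part of the exercise). Samples are drawn from f by picking the negative piece with probability 1-\theta and otherwise inverting the conditional CDF 1-(1-x)^2 on (0,1]:

    import numpy as np

    rng = np.random.default_rng(0)
    theta, n, reps = 0.4, 200, 5000   # illustrative values, not from the exercise

    est_xbar, est_y = [], []
    for _ in range(reps):
        u = rng.random(n)
        # With probability 1-theta, X is uniform on [-1, 0]; otherwise X has
        # conditional density 2(1-x) on (0, 1], i.e. conditional CDF 1-(1-x)^2,
        # inverted as x = 1 - sqrt(u) (u and 1-u have the same distribution).
        neg = rng.random(n) < 1 - theta
        x = np.where(neg, -u, 1.0 - np.sqrt(u))
        est_xbar.append(1.2 * (x.mean() + 0.5))   # (6/5)(Xbar + 1/2), post #2
        est_y.append((x > 0).mean())              # Y/n, post #3

    print("true theta:", theta)
    print("mean of (6/5)(Xbar + 1/2):", np.mean(est_xbar))
    print("mean of Y/n:", np.mean(est_y))

    Both averages should come out close to the chosen \theta.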
