1. Estimators...

Hello,

Okay, I don't know at all if I did these questions correctly...

Let $\displaystyle f(x;\theta)=\begin{cases} 1-\theta & \text{ if } -1\leq x\leq 0 \\ 2\theta (1-x) & \text{ if } 0<x\leq 1 \\ 0 & \text{ if } x>1 \end{cases}$

and where $\displaystyle 0<\theta<1$

Let X be a rv with pdf f.
We know that $\displaystyle \mathbb{E}(X)=\tfrac 56 \cdot\theta-\tfrac 12$
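(Direct integration over the two pieces of the density confirms this:)

$\displaystyle \mathbb{E}(X)=\int_{-1}^{0}x(1-\theta)\,dx+\int_{0}^{1}2\theta x(1-x)\,dx=-\frac{1-\theta}{2}+2\theta\left(\frac{1}{2}-\frac{1}{3}\right)=\frac{5}{6}\theta-\frac{1}{2}$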

Let $\displaystyle (X_1,\dots,X_n)$ be a sample of iid rvs following the same distribution as X.

Let the rv $\displaystyle Z_i=\begin{cases} 1 &\text{ if } X_i>0 \\ 0 &\text{ otherwise}\end{cases}$ and let $\displaystyle Y=\sum_{i=1}^n Z_i$

Preliminary questions: we proved that Y/n is an unbiased and convergent estimator for $\displaystyle \theta$.
We also proved that Y follows a binomial distribution with parameters n and $\displaystyle \theta$

First question: calculate an estimator T for $\displaystyle \theta$ by the method of moments.
--------------------
So for this one, ... We know that $\displaystyle \theta=\left(\mathbb{E}(X)+\tfrac 12\right) \cdot \tfrac 65$

So we can take $\displaystyle T=\left(\tfrac Yn+\tfrac 12\right) \cdot \tfrac 65$, right?

I thought of using the observations of the $\displaystyle Z_i$, but the table of values we are given is for the $\displaystyle X_i$, not the $\displaystyle Z_i$. So it seemed too easy to me...

--------------------

Second question: prove that T converges in probability to $\displaystyle \theta$ and calculate its bias.
--------------------
To prove the convergence, I used the LLN (the weak version, which gives convergence in probability), then Slutsky's theorem.

Then for its bias, I find 0... is that normal?? This is where I am the most doubtful...

--------------------

My own questions:
- is the estimator obtained by the method of moments unique?
- is it always unbiased?

It may sound stupid, but I don't know how to be sure...

2. Do you mean a consistent estimator for $\displaystyle \theta$?

And $\displaystyle f(x)=0$ for $\displaystyle x<-1$ too.

Your mean of X is correct.
But I would think that the MOM estimator of theta would be the solution to

$\displaystyle {5\over 6}\hat\theta-{1\over 2}={\sum_{i=1}^n X_i\over n}=\bar X$

You're using the Z's when you use that Y; those are indicators built from the X's, not the X's themselves.
I'm just setting the population mean of the X's equal to its sample mean.

Since

$\displaystyle \bar X \buildrel P\over\to {5\over 6}\theta -{1\over 2}$

$\displaystyle \hat\theta={6\over 5}\biggl(\bar X +{1\over 2}\biggr)\buildrel P\over\to \theta$
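A quick simulation sketch (assuming NumPy, with an illustrative true value $\displaystyle \theta=0.3$ that is not part of the problem) shows $\displaystyle \hat\theta={6\over 5}(\bar X+{1\over 2})$ settling near $\displaystyle \theta$ for large n:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 0.3          # illustrative true value, chosen for this demo
n = 200_000

# Draw from the mixture: with prob 1-theta, X ~ Uniform(-1, 0);
# with prob theta, X has density 2(1-x) on (0, 1]
# (inverse CDF of that piece: x = 1 - sqrt(1 - u)).
u = rng.uniform(size=n)
positive = rng.uniform(size=n) < theta
x = np.where(positive, 1 - np.sqrt(1 - u), -u)

# Method-of-moments estimator: solve (5/6)theta_hat - 1/2 = Xbar
theta_hat = (6 / 5) * (x.mean() + 0.5)
print(theta_hat)
```

With n this large the estimate lands within a few thousandths of the true value.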

3. If you want to use Y in the MOM estimation, you should set

$\displaystyle \hat{E}(Z_i)={\sum_{i=1}^n Z_i\over n}={Y\over n}$

Now $\displaystyle E(Z_i)=P\{X_i>0\}=2\theta \int_0^1 (1-x)\,dx=\theta$

In that case you get $\displaystyle \hat\theta={Y\over n}$
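As a sanity check (again assuming NumPy, with the same illustrative $\displaystyle \theta=0.3$): since $\displaystyle P\{X_i>0\}=\theta$, the indicators are Bernoulli($\displaystyle \theta$) and $\displaystyle Y/n$ also lands near $\displaystyle \theta$, consistent with $\displaystyle Y\sim\mathrm{Bin}(n,\theta)$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 0.3          # illustrative true value, not from the problem
n = 200_000

# Z_i = 1{X_i > 0} is Bernoulli(theta), so Y = sum(Z_i) ~ Binomial(n, theta);
# we can simulate the indicators directly.
z = rng.uniform(size=n) < theta
theta_hat = z.sum() / n
print(theta_hat)
```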