Hello,

Okay, I'm not at all sure I did these questions correctly...

Let $\displaystyle f(x;\theta)=\begin{cases} 1-\theta & \text{ if } -1\leq x\leq 0 \\ 2\theta (1-x) & \text{ if } 0<x\leq 1 \\ 0 & \text{ otherwise} \end{cases}$

where $\displaystyle 0<\theta<1$.

Let X be a random variable (rv) with pdf f.

We know that $\displaystyle \mathbb{E}(X)=\tfrac 56 \cdot\theta-\tfrac 12$
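(For reference, this comes from integrating $\displaystyle x\,f(x;\theta)$ over each piece:

$\displaystyle \mathbb{E}(X)=\int_{-1}^{0}x(1-\theta)\,dx+\int_{0}^{1}2\theta x(1-x)\,dx=-\frac{1-\theta}{2}+2\theta\left(\frac 12-\frac 13\right)=\frac 56\,\theta-\frac 12$ )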

Let $\displaystyle (X_1,\dots,X_n)$ be an iid sample with the same distribution as X.

Let the rv $\displaystyle Z_i=\begin{cases} 1 &\text{ if } X_i>0 \\ 0 &\text{ otherwise}\end{cases}$ and let $\displaystyle Y=\sum_{i=1}^n Z_i$

Preliminary questions: we proved that Y/n is an unbiased and consistent estimator of $\displaystyle \theta$.

We also proved that Y follows a binomial distribution with parameters $\displaystyle n$ and $\displaystyle \theta$.
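(This follows from the success probability of each $\displaystyle Z_i$:

$\displaystyle \mathbb{P}(Z_i=1)=\mathbb{P}(X_i>0)=\int_{0}^{1}2\theta(1-x)\,dx=\theta$

so $\displaystyle Y\sim\mathcal{B}(n,\theta)$.)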

First question: derive an estimator T of $\displaystyle \theta$ by the method of moments.

--------------------

So for this one... we know that $\displaystyle \theta=\left(\mathbb{E}(X)+\tfrac 12\right) \cdot \tfrac 65$

So we can take $\displaystyle T=\left(\tfrac Yn+\tfrac 12\right) \cdot \tfrac 65$, right?

I thought of using the observations of the $\displaystyle Z_i$, but the data table gives values of the $\displaystyle X_i$, not the $\displaystyle Z_i$. So I suspected that approach was too easy...
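As a numerical sanity check (a minimal sketch; the sampler and the sample-mean version of T are my own assumptions, not part of the exercise), one can simulate the distribution and plug the sample mean into the formula above:

```python
import math
import random

def sample_x(theta, rng):
    """Draw one X from f(x; theta), written as a mixture:
    with prob 1 - theta, X ~ Uniform[-1, 0];
    with prob theta, X has density 2(1 - x) on (0, 1],
    sampled by inverting its CDF F(x) = 1 - (1 - x)**2."""
    if rng.random() < 1 - theta:
        return -rng.random()
    return 1 - math.sqrt(1 - rng.random())

def moment_estimate(xs):
    # T = (sample mean + 1/2) * 6/5 -- the sample-mean version of the
    # method-of-moments estimator (my assumption, not the Y/n variant).
    xbar = sum(xs) / len(xs)
    return (xbar + 0.5) * 6 / 5

rng = random.Random(0)
theta = 0.3
xs = [sample_x(theta, rng) for _ in range(200_000)]
print(moment_estimate(xs))  # should land close to theta = 0.3
```

With 200 000 draws the estimate lands within a couple of hundredths of the true $\displaystyle \theta$, which at least rules out a gross algebra mistake in the formula.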

--------------------

Second question: prove that T converges in probability to $\displaystyle \theta$ and compute its bias.

--------------------

So in order to prove the convergence, I used the LLN (the version stating convergence in probability), then Slutsky's theorem.
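Written out, the step is (under my sample-mean reading of T, i.e. $\displaystyle T=\tfrac 65\left(\bar X_n+\tfrac 12\right)$), by the weak LLN and the continuous mapping theorem (Slutsky also works here, since the limit is a constant):

$\displaystyle \bar X_n\xrightarrow{\;\mathbb{P}\;}\mathbb{E}(X)\quad\Longrightarrow\quad T=\tfrac 65\left(\bar X_n+\tfrac 12\right)\xrightarrow{\;\mathbb{P}\;}\tfrac 65\left(\tfrac 56\,\theta-\tfrac 12+\tfrac 12\right)=\theta$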

Then for its bias, I find 0... is that normal? This is the part I'm most doubtful about...
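(If $\displaystyle T=\tfrac 65\left(\bar X_n+\tfrac 12\right)$ with $\displaystyle \bar X_n$ the sample mean — again my sample-mean reading of T — the computation is just linearity of expectation:

$\displaystyle \mathbb{E}(T)=\tfrac 65\left(\mathbb{E}(\bar X_n)+\tfrac 12\right)=\tfrac 65\left(\tfrac 56\,\theta-\tfrac 12+\tfrac 12\right)=\theta$

which gives bias $\displaystyle \mathbb{E}(T)-\theta=0$.)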

--------------------

My own questions:

- Is the method-of-moments estimator unique?

- Is it always unbiased?

These may sound silly, but I don't know how to be sure...

Thanks in advance!