# sampling distributions of estimators

• Apr 10th 2011, 10:49 PM
holly123
sampling distributions of estimators
Suppose that a random variable X has a geometric distribution for which the parameter p is unknown (0<p<1). Show that the only unbiased estimator of p is the estimator $\displaystyle \delta(X)$ such that $\displaystyle \delta(0)=1$ and $\displaystyle \delta(X)=0$ for X>0.

any clues on this?? thank you so much in advance
• Apr 11th 2011, 03:50 AM
Sambit
Quote:

Originally Posted by SpringFan25
suppose you have 1 realisation of x and your estimator is c(x)

For your estimator to be unbiased you require:
$\displaystyle E(c(x)) = p$

$\displaystyle \sum c(x) \times p(1-p)^{x-1} =p$
where the summation is taken over all possible values of x (1,2,3,4,5,6...)

Now, consider the following function:

c(x)=1 if x=1
c(x)=0 otherwise

The expected value of our function is then

$\displaystyle E(c(X)) = \sum c(x) \times p(1-p)^{x-1}$
$\displaystyle =(c(1)\times p) + (c(2)\times p(1-p)) + (c(3)\times p(1-p)^2) +...$
$\displaystyle =(1 \times p) + (0 \times p(1-p)) + (0 \times p(1-p)^2) +...$
$\displaystyle =p$
as required

So your estimator is $\displaystyle c(x)$ where c(x) was defined above.

In reality, this estimator is of no practical use. But it is unbiased, which is all the question asked for. I did assume that your sample is a single realisation from the distribution. If you have a sample with multiple data points, just discard all but one of them and the procedure still works.
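As a quick numerical sanity check (my own sketch, not from the thread; it uses numpy and the same convention as the post above, where X is the trial number of the first success, so X takes values 1, 2, 3, ...), the sample mean of c(X) over many simulated draws should be close to p:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3

# numpy's geometric matches the post's convention:
# X is the trial number of the first success, so X takes values 1, 2, 3, ...
x = rng.geometric(p, size=200_000)

# The estimator from the post: c(x) = 1 if x = 1, and 0 otherwise
c = (x == 1).astype(float)

# E[c(X)] = P(X = 1) = p, so the sample mean should be near 0.3
print(c.mean())
```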

• Apr 11th 2011, 04:42 AM
CaptainBlack
Quote:

Originally Posted by Sambit

But that post did not answer the question, which was to show that the given estimator is the unique unbiased estimator; that post only showed that the given estimator is unbiased (and there is a typo in the latter part of the post).

It looks like the proof relies on the uniqueness of a power series expansion of a (constant) function on the closed unit interval.

CB
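One way to make that suggestion concrete (a sketch, not from the thread, using the convention in the original question, where the pmf is $\displaystyle p(1-p)^x$ for $x=0,1,2,\dots$): unbiasedness of $\displaystyle \delta$ means

$\displaystyle \sum_{x=0}^{\infty} \delta(x)\, p(1-p)^x = p$ for all $p \in (0,1)$

Dividing both sides by $p$ and substituting $t = 1-p$ gives

$\displaystyle \sum_{x=0}^{\infty} \delta(x)\, t^x = 1$ for all $t \in (0,1)$

The right-hand side is the power series $\displaystyle 1 + 0\cdot t + 0\cdot t^2 + \dots$, and since the power series expansion of a function on an interval is unique, matching coefficients forces $\displaystyle \delta(0)=1$ and $\displaystyle \delta(x)=0$ for $x>0$.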
• Apr 11th 2011, 05:40 AM
theodds
Quote:

Originally Posted by CaptainBlack
But that post did not answer the question, which was to show that the given estimator is the unique unbiased estimator; that post only showed that the given estimator is unbiased (and there is a typo in the latter part of the post).

It looks like the proof relies on the uniqueness of a power series expansion of a (constant) function on the closed unit interval.

CB

X is complete, so every function of X is the almost-surely unique unbiased estimator of its expectation. If the OP can appeal to completeness, then the problem is done.
• Apr 11th 2011, 05:48 AM
CaptainBlack
Quote:

Originally Posted by theodds
X is complete, so every function of X is the almost-surely unique unbiased estimator of its expectation. If the OP can appeal to completeness, then the problem is done.

That makes no sense at all.

CB
• Apr 11th 2011, 05:54 AM
theodds
What part exactly doesn't make sense? X is distributed according to an exponential family of full rank, so X is complete, and because X is complete there is only one unbiased estimator of p (up to almost-sure equality)....
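To spell out the exponential family point (a sketch, not from the thread, again using the pmf $\displaystyle p(1-p)^x$ on $x=0,1,2,\dots$): the geometric pmf can be written as

$\displaystyle p(1-p)^x = \exp\big(x\log(1-p) + \log p\big)$

which is a one-parameter exponential family with natural parameter $\displaystyle \eta = \log(1-p)$ ranging over the open interval $(-\infty,0)$, so the family is of full rank and $X$ is complete. Completeness says that if $\displaystyle E_p[g(X)]=0$ for all $p\in(0,1)$, then $g(X)=0$ almost surely; applied to the difference of two unbiased estimators of $p$, it forces them to agree almost surely.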
• Apr 11th 2011, 07:09 PM
Sambit
Quote:

Originally Posted by CaptainBlack
That makes no sense at all.

CB

I don't understand why. If you go through that page, you will see that the proof of the MVUE is given there.