# Estimators Question

• Feb 12th 2010, 08:01 AM
craig
Estimators Question
Two independent variables X and Y each have an expected value of $\displaystyle \mu$, but different variances $\displaystyle \sigma^2$ and $\displaystyle \tau ^2$.

If the linear combination $\displaystyle Z = aX + bY$ is used to estimate $\displaystyle \mu$, then what relationship between $\displaystyle a$ and $\displaystyle b$ is necessary for this estimator to be unbiased?

I don't know how to prove that an estimator is unbiased, so I'm not really sure how to start this part.

Find the simplified expression for the variance of any unbiased estimator that is a linear combination of X and Y.

I'm guessing you need the first part for this?

Use this to prove that the unbiased estimator with the lowest standard error has coefficients:

$\displaystyle a = \frac{\tau^2}{\sigma^2 + \tau^2}$

$\displaystyle b = \frac{\sigma^2}{\sigma^2 + \tau^2}$

and has variance $\displaystyle \frac{\tau^2 \sigma^2}{\sigma^2 + \tau^2}$

I apologise for the lack of an attempt at this question. If someone could help me out on the first bit and give me a little pointer, I think I could give the second and third parts a go.

• Feb 13th 2010, 05:33 AM
Moo
Hello,

Question 1)
In order to prove that an estimator $\displaystyle \hat \theta$ is unbiased, you have to check that its expected value equals the parameter being estimated; here that means $\displaystyle E[Z]=\mu$. Since $\displaystyle E[Z]=aE[X]+bE[Y]=(a+b)\mu$, a and b must satisfy $\displaystyle a+b=1$.

Question 2)
Any unbiased estimator will be of the form $\displaystyle aX+(1-a)Y$
So its variance is : $\displaystyle var(aX+(1-a)Y)=a^2 var(X)+(1-a)^2var(Y)$, since X and Y are independent.
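That variance formula is easy to sanity-check numerically. The short Monte Carlo sketch below (my addition, not from the thread, with illustrative values $\mu=5$, $\sigma=2$, $\tau=1$, $a=0.3$ chosen arbitrarily) draws independent normal X and Y and compares the sample variance of $aX+(1-a)Y$ against $a^2\sigma^2+(1-a)^2\tau^2$:

```python
import numpy as np

# Illustrative values (assumptions, not from the thread)
mu, sigma, tau, a = 5.0, 2.0, 1.0, 0.3
n = 200_000
rng = np.random.default_rng(0)

X = rng.normal(mu, sigma, n)   # independent draws with variance sigma^2
Y = rng.normal(mu, tau, n)     # independent draws with variance tau^2
Z = a * X + (1 - a) * Y        # unbiased since a + (1 - a) = 1

theory = a**2 * sigma**2 + (1 - a)**2 * tau**2  # 0.09*4 + 0.49*1 = 0.85
print(Z.mean())   # close to mu = 5, confirming unbiasedness
print(Z.var())    # close to the theoretical variance 0.85
```

The sample mean and variance should match $\mu$ and the formula up to Monte Carlo noise of order $1/\sqrt{n}$.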

Question 3)
It's your turn to work (and I don't know how to do it :p)
• Feb 13th 2010, 07:15 AM
matheagle
To be unbiased you need the expected value of the estimator to equal $\displaystyle \mu$.

So $\displaystyle \mu=E(Z)=aE(X)+bE(Y)=a\mu+b\mu$

That means that a+b=1.

Next, you minimize $\displaystyle V(Z)=a^2V(X)+(1-a)^2V(Y)=a^2\sigma^2+(1-a)^2\tau^2$ wrt a via calc one.

---------------------------------------------------
Someone changed their "solution"
• Feb 14th 2010, 08:10 AM
craig
Thanks for both replies, makes a little more sense now ;)

Still not sure what to do on the last bit. As you both said above, the variance is of the form $\displaystyle a^2 \sigma^2+(1-a)^2\tau^2$, but I'm not sure how that gets to $\displaystyle \frac{\tau^2 \sigma^2}{\sigma^2 + \tau^2}$, or how to isolate a and b.

Thanks again for the help so far, if anyone else can shed any light on the last bit it would be much appreciated ;)
• Feb 14th 2010, 08:13 AM
matheagle
as I already said, replace b with 1-a and use calculus.......................

Next, you minimize $\displaystyle V(Z)=a^2V(X)+(1-a)^2V(Y)=a^2\sigma^2+(1-a)^2\tau^2$ wrt a via calc one.
• Feb 15th 2010, 12:10 AM
craig
Quote:

Originally Posted by matheagle
as I already said, replace b with 1-a and use calculus.......................

Next, you minimize $\displaystyle V(Z)=a^2V(X)+(1-a)^2V(Y)=a^2\sigma^2+(1-a)^2\tau^2$ wrt a via calc one.

Sorry I didn't see that on your last post.

Differentiating wrt a gives a simple linear equation to solve :)

Thanks a lot for the help here. When it said minimise, I was trying to set the original equation to zero; it didn't occur to me to find the minimum value through differentiation.
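For completeness, the differentiation step the thread arrives at can be written out in full:

```latex
\frac{dV}{da} = 2a\sigma^2 - 2(1-a)\tau^2 = 0
\quad\Longrightarrow\quad
a(\sigma^2 + \tau^2) = \tau^2
\quad\Longrightarrow\quad
a = \frac{\tau^2}{\sigma^2 + \tau^2},
\qquad
b = 1 - a = \frac{\sigma^2}{\sigma^2 + \tau^2}

% Substituting back into V:
V(Z) = a^2\sigma^2 + (1-a)^2\tau^2
     = \frac{\tau^4\sigma^2 + \sigma^4\tau^2}{(\sigma^2 + \tau^2)^2}
     = \frac{\tau^2\sigma^2}{\sigma^2 + \tau^2}
```

which matches the $a$, $b$, and minimum variance stated in the original question.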