Two independent variables X and Y each have an expected value of $\displaystyle \mu$, but different variances $\displaystyle \sigma^2$ and $\displaystyle \tau ^2$.

If the linear combination $\displaystyle Z = aX + bY$ is used to estimate $\displaystyle \mu$, then what relationship between $\displaystyle a$ and $\displaystyle b$ is necessary for this estimator to be unbiased?

I don't know how to prove that an estimator is unbiased, so I'm not really sure how to start this part.

Find the simplified expression for the variance of any unbiased estimator that is a linear combination of X and Y.

I'm guessing you need the first part for this?

Use this to prove that the unbiased estimator with the lowest standard error takes the form:

$\displaystyle a = \frac{\tau^2}{\sigma^2 + \tau^2}$

$\displaystyle b = \frac{\sigma^2}{\sigma^2 + \tau^2}$

and has variance $\displaystyle \frac{\tau^2 \sigma^2}{\sigma^2 + \tau^2}$
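I don't have a derivation, but I did try a quick Monte Carlo sanity check of the claimed weights and variance (the normal distributions, the seed, and the particular values of $\mu$, $\sigma^2$, $\tau^2$ here are my own choices just for the simulation; the problem only fixes the mean and the two variances):

```python
import numpy as np

# My own example values -- the problem only specifies the common mean
# and the two variances, not the distributions themselves.
rng = np.random.default_rng(0)
mu, sigma2, tau2 = 5.0, 4.0, 9.0
n = 1_000_000

# Simulate X and Y as independent normals with the given mean/variances.
X = rng.normal(mu, np.sqrt(sigma2), n)
Y = rng.normal(mu, np.sqrt(tau2), n)

# The claimed optimal weights.
a = tau2 / (sigma2 + tau2)
b = sigma2 / (sigma2 + tau2)
Z = a * X + b * Y

print(Z.mean())                            # should be close to mu
print(Z.var())                             # should be close to the claimed variance
print(sigma2 * tau2 / (sigma2 + tau2))     # claimed variance
```

The sample mean of $Z$ comes out right next to $\mu$ and its sample variance right next to $\frac{\tau^2 \sigma^2}{\sigma^2 + \tau^2}$, so at least the stated answer looks consistent, even if I can't yet prove it.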

I apologise for the lack of an attempt at this question. If someone could help me out with the first bit and give me a little pointer, I think I could give the second and third parts a go.

Thanks in advance