# Thread: Finding the expectation of a constant bias

1. ## Finding the estimate of a constant bias on measurements

Someone is making four measurements:
$\displaystyle y_1$, $\displaystyle y_2$ of $\displaystyle \lambda$, $\displaystyle \mu$ respectively, $\displaystyle y_3$ and $\displaystyle y_4$ both of $\displaystyle \lambda + \mu$.
The measurements are subject to independent, normally distributed random errors with known variance $\displaystyle {\sigma}^2$.
He suspects that there is a constant bias $\displaystyle \beta$ in every measurement.
How do you show that this bias is estimated as $\displaystyle y_1+y_2-\tfrac{1}{2}(y_3+y_4)$?

2. Originally Posted by mongrel73
Someone is making four measurements:
$\displaystyle y_1$, $\displaystyle y_2$ of $\displaystyle \lambda$, $\displaystyle \mu$ respectively, $\displaystyle y_3$ and $\displaystyle y_4$ both of $\displaystyle \lambda + \mu$.
The measurements are subject to independent, normally distributed random errors with known variance $\displaystyle {\sigma}^2$.
He suspects that there is a constant bias $\displaystyle \beta$ in every measurement.
How do you show that this bias is estimated as $\displaystyle y_1+y_2-\tfrac{1}{2}(y_3+y_4)$?
Suppose:

$\displaystyle y_1=\lambda+\beta+\varepsilon_1$

$\displaystyle y_2=\mu+\beta+\varepsilon_2$

$\displaystyle y_3=\lambda+ \mu +\beta+\varepsilon_3$

$\displaystyle y_4=\lambda+ \mu +\beta+\varepsilon_4$

where $\displaystyle \varepsilon_1,\ \varepsilon_2,\ \varepsilon_3,\ \varepsilon_4$ are independent and normally distributed with zero mean and variance $\displaystyle \sigma^2$.

Now write:

$\displaystyle \theta= y_1+y_2-(y_3+y_4)/2$

Now find $\displaystyle E(\theta)$ and show that it equals $\displaystyle \beta$.
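
Spelling out that expectation from the model above (with the minus sign, as in the estimator quoted in the question):

```latex
\begin{align*}
E(\theta) &= E\bigl[y_1+y_2-\tfrac{1}{2}(y_3+y_4)\bigr]\\
          &= (\lambda+\beta)+(\mu+\beta)-\tfrac{1}{2}\bigl[(\lambda+\mu+\beta)+(\lambda+\mu+\beta)\bigr]\\
          &= \lambda+\mu+2\beta-(\lambda+\mu+\beta)\\
          &= \beta.
\end{align*}
```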

CB

3. Write:

$\displaystyle \theta= y_1+y_2-(y_3+y_4)/2$

Now find $\displaystyle E(\theta)$ and show that it equals $\displaystyle \beta$.
Doesn't that just show that the estimate is unbiased?
I mean, if you write $\displaystyle \theta=y_1+y_2-y_3$, then $\displaystyle E(\theta)=\beta$ as well, so that can't be sufficient.
How do you show that the one given in the question is the best estimate to use?

4. Originally Posted by mongrel73
Doesn't that just show that the estimate is unbiased?
I mean, if you write $\displaystyle \theta=y_1+y_2-y_3$, then $\displaystyle E(\theta)=\beta$ as well, so that can't be sufficient.
How do you show that the one given in the question is the best estimate to use?
1. $\displaystyle E(\theta)=\beta$ is the condition that $\displaystyle \theta$ is an (unbiased) estimator of the bias.

2. The question did not ask you to show that it is the best estimate, nor did it specify in what sense it should be "best".

CB
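
That said, the estimator in the question does come out on top if "best" means smallest variance among these unbiased estimators: $\displaystyle \mathrm{Var}\bigl(y_1+y_2-\tfrac{1}{2}(y_3+y_4)\bigr)=\tfrac{5}{2}\sigma^2$, versus $\displaystyle \mathrm{Var}(y_1+y_2-y_3)=3\sigma^2$. (It is also the least-squares estimate of $\displaystyle \beta$ in this three-parameter model.) A quick simulation sketch checks both the unbiasedness and the variances; the parameter values $\lambda=3$, $\mu=5$, $\beta=0.7$, $\sigma=1$ are illustrative choices, not from the thread:

```python
import numpy as np

# Simulate the measurement model from the thread:
#   y1 = lambda + beta + e1,  y2 = mu + beta + e2,
#   y3 = y4 = lambda + mu + beta + (independent errors)
# Parameter values below are illustrative assumptions.
rng = np.random.default_rng(0)
lam, mu, beta, sigma = 3.0, 5.0, 0.7, 1.0
n = 200_000  # number of simulated experiments

e = rng.normal(0.0, sigma, size=(4, n))
y1 = lam + beta + e[0]
y2 = mu + beta + e[1]
y3 = lam + mu + beta + e[2]
y4 = lam + mu + beta + e[3]

theta = y1 + y2 - 0.5 * (y3 + y4)  # estimator from the question
alt = y1 + y2 - y3                 # alternative mongrel73 suggests

print(theta.mean())  # ~ beta: unbiased
print(alt.mean())    # ~ beta: also unbiased
print(theta.var())   # ~ 2.5 * sigma**2
print(alt.var())     # ~ 3.0 * sigma**2, so theta has smaller variance
```

Both estimators average out to $\beta$, but the one in the question has the smaller spread, which is the sense in which it is preferable.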