1. ## New estimator

In order to estimate a population mean, $\mu$, two surveys were conducted independently and the statistics were noted ($\overline{X}_1, \overline{X}_2, \sigma_{\overline{X}_1}, \sigma_{\overline{X}_2}$ are obtained). Assume that $\overline{X}_1$ and $\overline{X}_2$ are unbiased. For some $\alpha$ and $\beta$, the two estimates can be combined to give a better estimator:

$X=\alpha \overline{X}_1 + \beta \overline{X}_2$

What choice of $\alpha$ and $\beta$ will minimize the variance, given that $\alpha + \beta = 1$?

I managed to find one equation, namely $\alpha + \beta = 1$. What is another equation that would allow me to minimise the variance and find suitable $\alpha$ and $\beta$?

2. Originally Posted by noob mathematician
In order to estimate a population mean, $\mu$, two surveys were conducted independently and the statistics were noted ($\overline{X}_1, \overline{X}_2, \sigma_{\overline{X}_1}, \sigma_{\overline{X}_2}$ are obtained). Assume that $\overline{X}_1$ and $\overline{X}_2$ are unbiased. For some $\alpha$ and $\beta$, the two estimates can be combined to give a better estimator:

$X=\alpha \overline{X}_1 + \beta \overline{X}_2$

What choice of $\alpha$ and $\beta$ will minimize the variance, given that $\alpha + \beta = 1$?

I managed to find one equation, namely $\alpha + \beta = 1$. What is another equation that would allow me to minimise the variance and find suitable $\alpha$ and $\beta$?
Do these surveys have the same sample size (or do we know the sample sizes), and are they otherwise identical?

CB

You need a relationship between the variances. Also, are these $\sigma$'s population standard deviations or sample standard deviations?
This is not complete.

4. Hi,

There are actually two questions here (however, there aren't many more clues than this):

Qn1: Find the conditions on $\alpha$ and $\beta$ that make the combined estimate unbiased. I get $\alpha + \beta = 1$.

Qn2: What choice of $\alpha$ and $\beta$ minimizes the variance, subject to the condition of unbiasedness above?
The answer, I suppose (it's true that $\alpha + \beta$ is still 1):

$\alpha= \frac{\sigma^2 _{\overline{X}_2}}{\sigma^2 _{\overline{X}_2}+\sigma^2 _{\overline{X}_1}}$, $\beta= \frac{\sigma^2 _{\overline{X}_1}}{\sigma^2 _{\overline{X}_2}+\sigma^2 _{\overline{X}_1}}$

I'm not sure whether the population is needed, since we already know their statistics.

$\sigma_{\overline{X}}$ is the standard error, so $\sigma^2_{\overline{X}} = Var(\overline{X})$.
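As a quick numerical check of the proposed weights (my own sketch, not part of the thread; the values of $x$ and $y$ below are hypothetical squared standard errors), a grid search over $\alpha$ confirms that $\alpha = \frac{\sigma^2_{\overline{X}_2}}{\sigma^2_{\overline{X}_1} + \sigma^2_{\overline{X}_2}}$ minimizes the combined variance:

```python
# Check that the closed-form weights minimize Var(X) = a^2*x + b^2*y
# subject to a + b = 1, for independent surveys.

x, y = 4.0, 1.0  # hypothetical values of Var(X̄1) and Var(X̄2)

# Closed-form weights from the answer above
alpha = y / (x + y)
beta = x / (x + y)

def var_combined(a):
    """Variance of a*X̄1 + (1-a)*X̄2 (independence: no covariance term)."""
    return a**2 * x + (1 - a)**2 * y

# Grid search over alpha in [0, 1]
best = min(var_combined(a / 1000) for a in range(1001))

print(var_combined(alpha))   # 0.8, which equals x*y/(x+y)
print(best)                  # 0.8, the grid minimum agrees
```

The minimum $0.8$ is smaller than either individual variance ($4.0$ and $1.0$), which is the sense in which the combined estimator is "better".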

5. Originally Posted by noob mathematician
Hi,

There are actually two questions here (however, there aren't many more clues than this):

Qn1: Find the conditions on $\alpha$ and $\beta$ that make the combined estimate unbiased. I get $\alpha + \beta = 1$.

Qn2: What choice of $\alpha$ and $\beta$ minimizes the variance, subject to the condition of unbiasedness above?
The answer, I suppose (it's true that $\alpha + \beta$ is still 1):

$\alpha= \frac{\sigma^2 _{\overline{X}_2}}{\sigma^2 _{\overline{X}_2}+\sigma^2 _{\overline{X}_1}}$, $\beta= \frac{\sigma^2 _{\overline{X}_1}}{\sigma^2 _{\overline{X}_2}+\sigma^2 _{\overline{X}_1}}$

I'm not sure whether the population is needed, since we already know their statistics.

$\sigma_{\overline{X}}$ is the standard error, so $\sigma^2_{\overline{X}} = Var(\overline{X})$.
Hi,
With the given answer: $\alpha= \frac{\sigma^2 _{\overline{X}_2}}{\sigma^2 _{\overline{X}_2}+\sigma^2 _{\overline{X}_1}}$, $\beta= \frac{\sigma^2 _{\overline{X}_1}}{\sigma^2 _{\overline{X}_2}+\sigma^2 _{\overline{X}_1}}$
I've tried to work backwards in order to obtain the 2nd equation:

$X=\alpha\overline{X}_1+\beta\overline{X}_2$
$Var(X)=\alpha^2 Var(\overline{X}_1)+\beta^2 Var(\overline{X}_2)=\alpha^2\sigma^2_{\overline{X}_1}+\beta^2\sigma^2_{\overline{X}_2}$ (Equation 2)

Let $x=\sigma^2 _{\overline{X}_1}, y=\sigma^2 _{\overline{X}_2}$

Substituting the answer into the 1st equation ($\alpha + \beta = 1$):

$\frac{y}{x+y}+\frac{x}{x+y}=1$ so it's valid.

Substituting the answer into the 2nd equation:

$Var(X)=(\frac{y}{x+y})^2x+(\frac{x}{x+y})^2y$
$=\frac{xy^2}{(x+y)^2}+\frac{x^2y}{(x+y)^2}$
$=\frac{xy}{x+y}$

However, I'm not sure why the variance of the new estimator becomes $\frac{xy}{x+y}$?
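One way to see where the weights and the minimum variance come from (a sketch of the calculus step, not taken from the thread): substitute $\beta = 1 - \alpha$ into Equation 2 and minimize over $\alpha$ directly.

$$\begin{aligned}
Var(X) &= \alpha^2 x + (1-\alpha)^2 y \\
\frac{d}{d\alpha}\,Var(X) &= 2\alpha x - 2(1-\alpha) y = 0
\quad\Longrightarrow\quad \alpha = \frac{y}{x+y},\ \beta = \frac{x}{x+y} \\
Var(X)_{\min} &= \frac{xy}{x+y} = \left(\frac{1}{x} + \frac{1}{y}\right)^{-1} \le \min(x,\, y)
\end{aligned}$$

So $\frac{xy}{x+y}$ is not a separate assumption; it is simply the value of Equation 2 at the optimal weights, and its reciprocal-sum form shows the combined estimator is never worse than the better of the two surveys.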