Originally Posted by **Jason Bourne**

Let $\displaystyle X_1, X_2, \dots, X_n$ be independent random variables, each with unknown mean $\displaystyle \mu$ and unknown variance $\displaystyle \sigma^2$. Further, let $\displaystyle Y_1, Y_2, \dots, Y_m$ be independent random variables (and independent of $\displaystyle X_1, X_2, \dots, X_n$), also with mean $\displaystyle \mu$ and variance $\displaystyle \sigma^2$.

(a) Show that $\displaystyle W = a\bar{X} + (1-a)\bar{Y}$ is an unbiased estimator of $\displaystyle \mu$, where $\displaystyle \bar{X}$ and $\displaystyle \bar{Y}$ are the respective sample means of the two samples.
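A sketch for (a), using linearity of expectation and the fact that each sample mean is unbiased for $\displaystyle \mu$:

$\displaystyle E[W] = aE[\bar{X}] + (1-a)E[\bar{Y}] = a\mu + (1-a)\mu = \mu$

so $\displaystyle W$ is unbiased for every choice of $\displaystyle a$.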

(b) Show that $\displaystyle Var(W)=\sigma^2 \left\{\frac{a^{2}}{n} + \frac{(1-a)^2}{m}\right\}$
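A sketch for (b): because the two samples are independent, $\displaystyle \bar{X}$ and $\displaystyle \bar{Y}$ are independent, so the variance splits across the two terms:

$\displaystyle Var(W) = a^2 Var(\bar{X}) + (1-a)^2 Var(\bar{Y}) = a^2\frac{\sigma^2}{n} + (1-a)^2\frac{\sigma^2}{m} = \sigma^2 \left\{\frac{a^{2}}{n} + \frac{(1-a)^2}{m}\right\}$

using $\displaystyle Var(\bar{X}) = \sigma^2/n$ and $\displaystyle Var(\bar{Y}) = \sigma^2/m$.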

(c) Show that $\displaystyle Var(W)$ is minimized when $\displaystyle a=\frac{n}{n+m}$
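A sketch for (c): differentiate the expression from (b) with respect to $\displaystyle a$ and set the derivative to zero:

$\displaystyle \frac{d}{da}Var(W) = \sigma^2\left\{\frac{2a}{n} - \frac{2(1-a)}{m}\right\} = 0 \implies am = (1-a)n \implies a = \frac{n}{n+m}$

The second derivative, $\displaystyle 2\sigma^2\left(\frac{1}{n}+\frac{1}{m}\right)$, is positive, so this critical point is indeed a minimum.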

(d) Show that

$\displaystyle S^2 = \frac{(n-1)S^2_X + (m-1)S^2_Y}{n+m-2}$

is an unbiased estimator of $\displaystyle \sigma^2$, where $\displaystyle S^2_X$ and $\displaystyle S^2_Y$ are the respective sample variances of the two samples.
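A sketch for (d): each sample variance (with divisor $\displaystyle n-1$ or $\displaystyle m-1$) is unbiased, so $\displaystyle E[S^2_X] = E[S^2_Y] = \sigma^2$, and linearity of expectation gives

$\displaystyle E[S^2] = \frac{(n-1)E[S^2_X] + (m-1)E[S^2_Y]}{n+m-2} = \frac{(n-1)\sigma^2 + (m-1)\sigma^2}{n+m-2} = \sigma^2$

As a quick numerical sanity check (not a proof), here is a small Monte Carlo sketch; the function name `pooled_estimators`, the choice of a normal distribution, and all parameter values below are my own, since the problem only specifies the mean and variance:

```python
import random
import statistics

def pooled_estimators(n, m, mu, sigma, a, trials, seed=0):
    """Average the estimators W and S^2 over many simulated replications."""
    rng = random.Random(seed)
    w_vals, s2_vals = [], []
    for _ in range(trials):
        xs = [rng.gauss(mu, sigma) for _ in range(n)]
        ys = [rng.gauss(mu, sigma) for _ in range(m)]
        # W = a*xbar + (1-a)*ybar, the combined estimator of the mean
        w_vals.append(a * statistics.fmean(xs) + (1 - a) * statistics.fmean(ys))
        # statistics.variance uses the n-1 divisor, i.e. the sample variance
        s2 = ((n - 1) * statistics.variance(xs)
              + (m - 1) * statistics.variance(ys)) / (n + m - 2)
        s2_vals.append(s2)
    return statistics.fmean(w_vals), statistics.fmean(s2_vals)
```

With, say, mu = 5, sigma = 2, n = 30, m = 20, and a = n/(n+m), the two averages should settle near 5 and 4 respectively as the number of trials grows, consistent with unbiasedness.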