# unbiased estimators

• Mar 8th 2008, 07:48 AM
Jason Bourne
unbiased estimators
Let $\displaystyle X_1,X_2, . . . ,X_n$ be independent random variables, each with unknown mean $\displaystyle \mu$ and unknown variance $\displaystyle \sigma^2$. Further, let $\displaystyle Y_1, Y_2, . . . , Y_m$ be independent random
variables (and independent of $\displaystyle X_1,X_2, . . . ,X_n$), also with mean $\displaystyle \mu$ and variance $\displaystyle \sigma^2$.

(a) Show that $\displaystyle W = a\bar{X} + (1-a)\bar{Y}$ is an unbiased estimator of $\displaystyle \mu$, where $\displaystyle \bar{X}$ and $\displaystyle \bar{Y}$
are the respective sample means of the two samples.

(b) Show that $\displaystyle Var(W)=\sigma^2 \{\frac{a^{2}}{n} + \frac{(1-a)^2}{m}\}$

(c) Show that $\displaystyle Var(W)$ is minimized when $\displaystyle a=\frac{n}{n+m}$

(d) Show that

$\displaystyle S^2 = \frac{(n-1)S^2_X + (m-1)S^2_Y}{n+m-2}$

is an unbiased estimator of $\displaystyle \sigma^2$, where $\displaystyle S^2_X$ and $\displaystyle S^2_Y$ are the respective sample variances of the two samples.
• Mar 8th 2008, 09:35 AM
CaptainBlack
Quote:

Originally Posted by Jason Bourne
Let $\displaystyle X_1,X_2, . . . ,X_n$ be independent random variables, each with unknown mean $\displaystyle \mu$ and unknown variance $\displaystyle \sigma^2$. Further, let $\displaystyle Y_1, Y_2, . . . , Y_m$ be independent random
variables (and independent of $\displaystyle X_1,X_2, . . . ,X_n$), also with mean $\displaystyle \mu$ and variance $\displaystyle \sigma^2$.

(a) Show that $\displaystyle W = a\bar{X} + (1-a)\bar{Y}$ is an unbiased estimator of $\displaystyle \mu$, where $\displaystyle \bar{X}$ and $\displaystyle \bar{Y}$
are the respective sample means of the two samples.

(b) Show that $\displaystyle Var(W)=\sigma^2 \{\frac{a^{2}}{n} + \frac{(1-a)^2}{m}\}$

(c) Show that $\displaystyle Var(W)$ is minimized when $\displaystyle a=\frac{n}{n+m}$

(d) Show that

$\displaystyle S^2 = \frac{(n-1)S^2_X + (m-1)S^2_Y}{n+m-2}$

is an unbiased estimator of $\displaystyle \sigma^2$, where $\displaystyle S^2_X$ and $\displaystyle S^2_Y$ are the respective sample variances of the two samples.

Are you sure that the $\displaystyle X_i$ 's are not supposed to be iid RV's?
(same for the $\displaystyle Y_i$ 's)

If they were, then your problem is to show that $\displaystyle E(W)=\mu$ given that $\displaystyle E(\bar{X})=\mu$ and $\displaystyle E(\bar{Y})=\mu$, and that $\displaystyle E(S^2)=\sigma^2$ given $\displaystyle E(S^2_X)=\sigma^2$ and $\displaystyle E(S^2_Y)=\sigma^2$.

(Though, come to think of it, this almost certainly works even if we drop the assumption of identical distributions.)

RonL
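Working out that hint, both unbiasedness claims follow from linearity of expectation; a sketch (using $\displaystyle E(S^2_X)=E(S^2_Y)=\sigma^2$ for the sample variances):

```latex
\begin{align*}
E(W) &= a\,E(\bar{X}) + (1-a)\,E(\bar{Y}) = a\mu + (1-a)\mu = \mu, \\[4pt]
E(S^2) &= \frac{(n-1)E(S^2_X) + (m-1)E(S^2_Y)}{n+m-2}
        = \frac{(n-1)\sigma^2 + (m-1)\sigma^2}{n+m-2} = \sigma^2.
\end{align*}
```

This settles (a) and (d); note that only the first moments of $\displaystyle \bar{X}$, $\displaystyle \bar{Y}$, $\displaystyle S^2_X$, $\displaystyle S^2_Y$ are needed, so identical distributions are not required.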
• Mar 8th 2008, 02:32 PM
Jason Bourne
Quote:

Originally Posted by CaptainBlack
Are you sure that the $\displaystyle X_i$ 's are not supposed to be iid RV's?

I don't know what you mean by "iid RV's". The problem is exactly as I have stated it.

Quote:

Originally Posted by CaptainBlack
If they were, then your problem is to show that $\displaystyle E(W)=\mu$ given that $\displaystyle E(\bar{X})=\mu$ and $\displaystyle E(\bar{Y})=\mu$, and that $\displaystyle E(S^2)=\sigma^2$ given $\displaystyle E(S^2_X)=\sigma^2$ and $\displaystyle E(S^2_Y)=\sigma^2$.

This seems along the right lines for this part of the question, thanks.
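For parts (b) and (c), the same approach works: $\displaystyle \bar{X}$ and $\displaystyle \bar{Y}$ are independent with variances $\displaystyle \sigma^2/n$ and $\displaystyle \sigma^2/m$, and minimizing over $a$ is a one-variable calculus problem. A sketch:

```latex
\begin{align*}
\operatorname{Var}(W) &= a^2\operatorname{Var}(\bar{X}) + (1-a)^2\operatorname{Var}(\bar{Y})
  = \sigma^2\left\{\frac{a^2}{n} + \frac{(1-a)^2}{m}\right\}, \\[4pt]
\frac{d}{da}\operatorname{Var}(W) &= \sigma^2\left\{\frac{2a}{n} - \frac{2(1-a)}{m}\right\} = 0
  \;\Longrightarrow\; am = (1-a)n \;\Longrightarrow\; a = \frac{n}{n+m},
\end{align*}
```

and since the second derivative $\displaystyle \sigma^2\left\{\frac{2}{n} + \frac{2}{m}\right\} > 0$, this critical point is indeed a minimum.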
• Mar 9th 2008, 01:22 AM
CaptainBlack
Quote:

Originally Posted by Jason Bourne
I don't know what you mean by "iid RV's". The problem is exactly as I have stated it.

iid RVs: independent, identically distributed random variables.

RonL
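As a numerical sanity check of (a), (b) and (d), here is a quick Monte Carlo sketch. It is not part of the thread; the sample sizes and parameters below are illustrative, and normal samples are used for convenience (any distribution with mean $\displaystyle \mu$ and variance $\displaystyle \sigma^2$ would do):

```python
import random
import statistics

random.seed(0)

mu, sigma = 5.0, 2.0   # illustrative values for the unknown mean and sd
n, m = 10, 15          # illustrative sample sizes
a = n / (n + m)        # the variance-minimizing weight from part (c)

trials = 20000
w_vals, s2_vals = [], []
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    ys = [random.gauss(mu, sigma) for _ in range(m)]
    xbar, ybar = statistics.mean(xs), statistics.mean(ys)
    # statistics.variance uses the (n-1) denominator, i.e. the sample variance
    s2x, s2y = statistics.variance(xs), statistics.variance(ys)
    w_vals.append(a * xbar + (1 - a) * ybar)
    s2_vals.append(((n - 1) * s2x + (m - 1) * s2y) / (n + m - 2))

var_w_theory = sigma**2 * (a**2 / n + (1 - a)**2 / m)  # part (b)

print(statistics.mean(w_vals))      # ≈ mu        (part a: W is unbiased)
print(statistics.variance(w_vals))  # ≈ var_w_theory (part b)
print(statistics.mean(s2_vals))     # ≈ sigma**2  (part d: pooled S^2 is unbiased)
```

With 20000 trials the empirical means land within a few standard errors of $\displaystyle \mu$, $\displaystyle \sigma^2 \{\frac{a^{2}}{n} + \frac{(1-a)^2}{m}\}$ and $\displaystyle \sigma^2$ respectively.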