
Biasedness of OLS estimates

http://img84.imageshack.us/img84/3668/bias2.jpg

If the picture is too small, please view attached pic.

Hey guys, I'm reading through the proof that OLS estimates are biased when an important independent variable is omitted, and I'm stuck at the very end. The mathematical step that confuses me is the part circled in red.

When we take the expectation of $\displaystyle \frac{\sum_{i=1}^n (x_{i1} - \bar{x_1})(x_{i2} - \bar{x_2})}{\sum_{i=1}^{n}\left((x_{i1}-\bar{x_1})^2\right)}$, how does it come out unchanged?

Ie, Why is $\displaystyle E\left[\frac{\sum_{i=1}^n (x_{i1} - \bar{x_1})(x_{i2} - \bar{x_2})}{\sum_{i=1}^{n}\left((x_{i1}-\bar{x_1})^2\right)}\right] = \frac{\sum_{i=1}^n (x_{i1} - \bar{x_1})(x_{i2} - \bar{x_2})}{\sum_{i=1}^{n}\left((x_{i1}-\bar{x_1})^2\right)}$ ?

Since $\displaystyle \frac{\sum_{i=1}^n (x_{i1} - \bar{x_1})(x_{i2} - \bar{x_2})}{\sum_{i=1}^{n}\left((x_{i1}-\bar{x_1})^2\right)}$ isn't obviously a constant: it depends on the sample, so it looks to me like a random variable that takes different values as the data change. Why, then, does it follow the rule $\displaystyle E(c) = c$, which holds when $\displaystyle c$ is a constant?
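To make my question concrete, here is how I understand the setup in code. This is just a numpy sketch with my own variable names, assuming the textbook's standard fixed-regressor (conditional-on-$x$) framework: the $x$'s are drawn once and held fixed, and only the error term would be redrawn across hypothetical replications, so the ratio is the same number in every replication.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Fixed regressors: in the classical derivation the x's are treated as
# given, so they are drawn once and never redrawn across replications.
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)  # correlated with x1, as in omitted-variable bias

def ratio(x1, x2):
    # The quantity inside the expectation:
    #   sum_i (x_i1 - xbar1)(x_i2 - xbar2) / sum_i (x_i1 - xbar1)^2
    num = ((x1 - x1.mean()) * (x2 - x2.mean())).sum()
    den = ((x1 - x1.mean()) ** 2).sum()
    return num / den

# With x1 and x2 held fixed, recomputing the ratio always gives the
# same value -- nothing random is left once the x's are conditioned on.
r = ratio(x1, x2)
print(r)
```

Is this the right picture, i.e. is the expectation implicitly conditional on the regressors, which is why the ratio can be pulled out like a constant?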

Thanks :)