
- Mar 8th 2011, 09:11 AM, LHS: Linear Regression: parameter estimation of alpha and beta?

Could anyone help me with the part of this question highlighted in the link? I can derive the first part, but I'm having problems canceling alpha; any help would be greatly appreciated!

http://img847.imageshack.us/img847/8853/25298638.png

- Mar 8th 2011, 11:09 AM, theodds:
Let Q be the function you want to minimize.

$\displaystyle \frac{\partial Q}{\partial \alpha} = 0 \iff \underbrace{\sum \frac{y_i}{\rho_i^2}}_{A_1} - \alpha \underbrace{\sum \frac{1}{\rho_i^2}}_{B_1} - \beta \underbrace{\sum \frac{x_i}{\rho_i^2}}_{C_1} = 0$

$\displaystyle \frac{\partial Q}{\partial \beta} = 0 \iff \underbrace{\sum \frac{x_i y_i}{\rho_i^2}}_{A_2} - \alpha \underbrace{\sum \frac{x_i}{\rho_i^2}}_{B_2} - \beta \underbrace{\sum \frac{x_i^2}{\rho_i^2}}_{C_2} = 0$

Just solve the simultaneous equations $\displaystyle A_1 - \alpha B_1 - \beta C_1 = 0$ and $\displaystyle A_2 - \alpha B_2 - \beta C_2 = 0$ in the usual way.
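For completeness, solving that $\displaystyle 2 \times 2$ system by Cramer's rule (note that $\displaystyle B_2 = C_1$, since both equal $\displaystyle \sum x_i/\rho_i^2$) gives

$\displaystyle \hat\alpha = \frac{A_1 C_2 - A_2 C_1}{B_1 C_2 - B_2 C_1}, \qquad \hat\beta = \frac{B_1 A_2 - B_2 A_1}{B_1 C_2 - B_2 C_1}$

which reduces to the familiar ordinary-least-squares estimators when all the $\displaystyle \rho_i$ are equal.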

Alternatively, we might notice that in the usual matrix notation

$\displaystyle Q(\vec\beta) = (Y - X\vec\beta)^T W (Y - X\vec\beta)$

where $\displaystyle \vec\beta = (\alpha, \beta)^T$, $\displaystyle X$ is the $\displaystyle n \times 2$ design matrix whose first column is all ones and whose second column holds the $\displaystyle x_i$, and $\displaystyle W = \mbox{diag}(\rho_i^{-2},\ i = 1, 2, \ldots, n)$. Solving with matrix calculus gives $\displaystyle \hat{\vec\beta} = (X^T W X)^{-1} X^T W Y$.
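As a quick numerical sanity check, here is a short Python/NumPy sketch (the data values are made up for illustration, not taken from the original question) confirming that the matrix form and the summation form of the equations yield the same estimates:

```python
import numpy as np

# Hypothetical data (illustrative only, not from the original problem)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])
rho = np.sqrt(x)          # e.g. variance growing with x
w = 1.0 / rho**2          # weights 1/rho_i^2

# Matrix form: minimize (Y - X b)^T W (Y - X b)
X = np.column_stack([np.ones_like(x), x])   # design matrix with intercept column
W = np.diag(w)
alpha_hat, beta_hat = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Summation form: A1 - alpha*B1 - beta*C1 = 0 and A2 - alpha*B2 - beta*C2 = 0
A1, B1, C1 = np.sum(w * y), np.sum(w), np.sum(w * x)
A2, B2, C2 = np.sum(w * x * y), np.sum(w * x), np.sum(w * x**2)
D = B1 * C2 - B2 * C1
alpha_sum = (A1 * C2 - A2 * C1) / D
beta_sum = (B1 * A2 - B2 * A1) / D

print(alpha_hat, beta_hat)
print(np.isclose(alpha_hat, alpha_sum), np.isclose(beta_hat, beta_sum))
```

Both routes produce identical $\displaystyle \hat\alpha$ and $\displaystyle \hat\beta$, since the matrix normal equations are exactly the two summation equations stacked together.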

An obvious application is where we think that the variance grows proportionally with $\displaystyle x_i$, so we could take $\displaystyle \rho_i = \sqrt{x_i}$.

- Mar 9th 2011, 07:17 AM, LHS:
Thank you very much! That is all sorted now!