# Thread: Linear Regression: parameter estimation of alpha and beta?

1. ## Linear Regression: parameter estimation of alpha and beta?

Could anyone help me with the part of this question highlighted in the link? I can derive the first part, but I'm having problems cancelling alpha. Any help would be greatly appreciated!

2. Originally Posted by LHS
Could anyone help me with the part of this question highlighted in the link? I can derive the first part, but I'm having problems cancelling alpha. Any help would be greatly appreciated!

Let $\displaystyle Q(\alpha, \beta) = \sum \frac{(y_i - \alpha - \beta x_i)^2}{\rho_i^2}$ be the function you want to minimize.

$\displaystyle
\frac {\partial Q}{\partial \alpha} = 0 \iff \underbrace{\sum \frac {y_i}{\rho_i^2}}_{A_1} - \alpha \underbrace{\sum \frac 1 {\rho_i^2}}_{B_1} - \beta \underbrace{\sum \frac{x_i}{\rho_i^2}}_{C_1} = 0
$

$\displaystyle
\frac {\partial Q}{\partial \beta} = 0 \iff \underbrace{\sum \frac {y_i x_i}{\rho_i^2}}_{A_2} - \alpha \underbrace{\sum \frac {x_i} {\rho_i^2}}_{B_2} - \beta \underbrace{\sum \frac{x_i^2}{\rho_i^2}}_{C_2} = 0
$

Just solve the simultaneous equations $A_1 - \alpha B_1 - \beta C_1 = 0$ and $A_2 - \alpha B_2 - \beta C_2 = 0$ in the usual way.
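As a numerical sanity check, the two equations can be solved directly with Cramer's rule. This is only a sketch: the data `x`, `y` and the weights `rho` below are invented for illustration (unit weights, so it reduces to ordinary least squares).

```python
# Invented example data; rho_i = 1 gives ordinary least squares.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 7.8]
rho = [1.0, 1.0, 1.0, 1.0]

w = [1.0 / r**2 for r in rho]  # w_i = 1 / rho_i^2

# The six sums named in the derivative equations above.
A1 = sum(wi * yi for wi, yi in zip(w, y))
B1 = sum(w)
C1 = sum(wi * xi for wi, xi in zip(w, x))
A2 = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
B2 = C1  # note B2 is the same sum as C1
C2 = sum(wi * xi**2 for wi, xi in zip(w, x))

# Cramer's rule on  B1*alpha + C1*beta = A1,  B2*alpha + C2*beta = A2.
det = B1 * C2 - C1 * B2
alpha = (A1 * C2 - C1 * A2) / det
beta = (B1 * A2 - A1 * B2) / det
print(alpha, beta)  # intercept and slope
```

With these made-up points the fit comes out to roughly $\alpha = 0.15$, $\beta = 1.94$, which matches the ordinary least-squares formulas on the same data.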

Alternatively, we might notice that in the usual matrix notation
$\displaystyle
Q(\vec \beta) = (Y - X \vec \beta)^T W (Y - X \vec \beta)
$

where $X$ is the $n \times 2$ design matrix with rows $(1, x_i)$, $\vec \beta = (\alpha, \beta)^T$, and
$\displaystyle
W = \mbox{diag}(\rho_i^{-2}), \ i = 1, 2, ..., n,
$
and solve with matrix calculus: setting the gradient to zero gives the normal equations $X^T W X \vec \beta = X^T W Y$.

An obvious application is where we think that the variance grows proportionally with $x_i$, so we could take $\rho_i = \sqrt{x_i}$.
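The matrix route can be sketched with NumPy. This is a hedged illustration, not the poster's code: the data are invented, `X` carries a column of ones for the intercept $\alpha$, and the weights use $\rho_i = \sqrt{x_i}$ as suggested above.

```python
import numpy as np

# Invented example data; rho_i = sqrt(x_i) models variance growing with x_i.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])
rho = np.sqrt(x)

X = np.column_stack([np.ones_like(x), x])  # design matrix, rows (1, x_i)
W = np.diag(rho**-2)                       # W = diag(1 / rho_i^2)

# Minimizing (Y - X b)^T W (Y - X b) leads to the normal equations
#   (X^T W X) b = X^T W Y
alpha, beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(alpha, beta)
```

Solving the normal equations with `np.linalg.solve` is preferable to forming the inverse $(X^T W X)^{-1}$ explicitly, for both speed and numerical stability.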

3. Thank you very much! That is all sorted now!