# One parameter least-squares

• Aug 15th 2007, 06:33 PM
tom88
One parameter least-squares
Hi,

Just a quick question about the analytical solution to the simple one-parameter least squares problem, i.e. y=mx.

If we assume the error (sigma) is the same for all points, it's easy to show that the solutions for m and its variance are:

m=MEAN(XY)/MEAN(X^2)

V(m)=(sigma^2)/(N*mean(X^2))

But what happens when we have a different error on each point, so that instead of sigma we have sigma(i) for each y(i)? What do these formulas become?

Thanks for your help!
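For anyone who wants to check the equal-sigma formulas numerically, here is a minimal Python sketch (the true slope, sigma, and data below are made-up illustrative values, not from any particular problem):

```python
import numpy as np

# Sketch of the equal-sigma case (illustrative values only):
# for y = M*x + eps with a common error sigma,
# m = mean(x*y)/mean(x^2) and V(m) = sigma^2/(N*mean(x^2)).
rng = np.random.default_rng(0)
N, M_true, sigma = 50, 2.5, 0.3
x = rng.uniform(1.0, 5.0, N)
y = M_true * x + rng.normal(0.0, sigma, N)

m = np.mean(x * y) / np.mean(x ** 2)        # closed-form slope estimate
v_m = sigma ** 2 / (N * np.mean(x ** 2))    # its variance

# Cross-check the slope against numpy's least-squares solver for a
# line through the origin (design matrix is the single column x).
m_check = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]
```

The solver result and the closed-form estimate agree because minimising sum((y_i - m*x_i)^2) gives exactly m = sum(x*y)/sum(x^2).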
• Aug 15th 2007, 08:47 PM
JakeD
Quote:

Originally Posted by tom88
Just a quick question about the analytical solution to the simple one-parameter least squares problem, i.e. y=mx. [...] What do these formulas become?

To get the formula you have for the variance, you calculated, either explicitly or implicitly,

$E((\sum_i X_i \epsilon)^2) = E((\epsilon \sum_i X_i)^2) = E(\epsilon^2 (\sum_i X_i)^2) = E(\epsilon^2) (\sum_i X_i)^2 = \sigma^2 (\sum_i X_i)^2.$

When the errors are different on all points, this becomes

$E((\sum_i X_i \epsilon_i)^2) = E(\sum_i \sum_j X_i\epsilon_i X_j \epsilon_j) = \sum_i \sum_j X_i X_j E(\epsilon_i \epsilon_j) = \sigma^2 (\sum_i X_i)^2,$

which is the same result. Can you justify the last step?
• Aug 15th 2007, 10:50 PM
tom88
Thanks for your reply. I'm slightly confused, though: your result is slightly different from mine, because you effectively have (sum of x)^2 rather than (sum of x^2). Also, are you saying that the variance does not change when the individual errors are used? Surely it must?
• Aug 15th 2007, 11:19 PM
JakeD
Quote:

Originally Posted by tom88
Thanks for your reply. I'm slightly confused, though: your result is slightly different from mine, because you effectively have (sum of x)^2 rather than (sum of x^2). Also, are you saying that the variance does not change when the individual errors are used? Surely it must?

Since your result is different, how about showing your derivation? Maybe this isn't so quick and easy (at least for me).
• Aug 16th 2007, 12:27 AM
CaptainBlack
You have a set of $x_i$ and the corresponding observed values of the dependent variable $y,\ y_i=M x_i+\epsilon_i$. We wish
to find an estimate $m$ of $M$ which minimises:

$
O=\sum (y_i-m x_i)^2/\sigma_i^2
$

where the $\sigma_i^2$ are known variances of the $\epsilon_i$'s (which we assume are independent).

So we solve:

$
\frac{dO}{dm}=-\sum \frac{2(y_i-m x_i)x_i}{\sigma_i^2}=0
$

Which has solution:

$
m=\frac{\sum y_i x_i/\sigma_i^2}{\sum x_i^2/\sigma_i^2}
$

which has variance:

$
var(m)=\frac{\sum x_i^2/\sigma_i^2}{(\sum x_i^2/\sigma_i^2)^2}=\frac{1}{\sum x_i^2/\sigma_i^2}
$

Please check this; it is quite likely that I have an error somewhere.

RonL
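These weighted formulas can be sanity-checked numerically as well; a sketch with made-up per-point sigmas (all values are illustrative):

```python
import numpy as np

# Sketch of the weighted case with per-point sigma_i (illustrative data):
# m = sum(x*y/s^2)/sum(x^2/s^2) and var(m) = 1/sum(x^2/s^2).
rng = np.random.default_rng(1)
N, M_true = 100, 1.7
x = rng.uniform(0.5, 3.0, N)
s = rng.uniform(0.1, 0.5, N)              # known sigma_i for each point
y = M_true * x + rng.normal(0.0, s)       # per-point noise scale broadcasts

w = 1.0 / s ** 2                          # weights 1/sigma_i^2
m = np.sum(w * x * y) / np.sum(w * x ** 2)
var_m = 1.0 / np.sum(w * x ** 2)

# At the minimum of O, the normal equation sum(w*(y - m*x)*x) = 0
# should hold up to floating-point error.
resid = np.sum(w * (y - m * x) * x)
```

The residual of the normal equation confirms that m really is the minimiser of the weighted sum of squares O.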
• Aug 16th 2007, 02:29 AM
JakeD
Quote:

Originally Posted by CaptainBlack
You have a set of $x_i$ and the corresponding observed values of the dependent variable $y,\ y_i=M x_i+\epsilon_i$. [...]

Please check this; it is quite likely that I have an error somewhere.

RonL

I see no errors. But we all have different setups and derivations in mind. E.g., your setup uses different variances; mine assumes they are all the same. I asked tom88 to show his setup and derivation because that would be easier for him to work with than trying to reconcile his derivation with mine.
• Aug 16th 2007, 07:04 PM
tom88
Thank you both for your help with this. The results derived by CaptainBlack are the ones I was looking for, and they agree with what I have now derived myself, so it looks as if they are right!

Sorry for any confusion, Jake. I didn't really understand either my setup or my derivation until now!
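As a final Monte Carlo sanity check (again with made-up numbers), the spread of the weighted estimator over many simulated data sets should match the analytical variance 1/sum(x_i^2/sigma_i^2):

```python
import numpy as np

# Monte Carlo check (illustrative numbers): simulate many data sets and
# compare the empirical variance of m against var(m) = 1/sum(x^2/s^2).
rng = np.random.default_rng(2)
M_true = 3.0
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
s = np.array([0.1, 0.2, 0.3, 0.2, 0.1])   # per-point sigma_i
w = 1.0 / s ** 2

var_pred = 1.0 / np.sum(w * x ** 2)       # analytical variance of m

trials = 20000
eps = rng.normal(0.0, s, size=(trials, x.size))   # sigma_i broadcasts per column
y = M_true * x + eps                       # trials simulated data sets
m_hat = (y * w * x).sum(axis=1) / np.sum(w * x ** 2)
var_emp = m_hat.var()                      # empirical variance of m
```

With this many trials the empirical and analytical variances should agree to within about one percent.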