
Math Help - One parameter least-squares

  1. #1
    Newbie
    Joined
    Aug 2007
    Posts
    3

    One parameter least-squares

    Hi,

    Just a quick question about the analytical solution to the simple one-parameter least squares problem, i.e. y=mx.

    If we assume the error (sigma) is the same for all points, it's easy to show that the solutions for m and its variance are:

m = mean(XY)/mean(X^2)

V(m) = sigma^2/(N*mean(X^2))

But what happens when the errors differ from point to point, so that instead of sigma we have sigma(i) for each y(i)? What do these formulas become?

    Thanks for your help!
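For concreteness, the two common-sigma formulas above can be sketched numerically (my own illustration in Python; `fit_through_origin` is a made-up helper name, not from the thread):

```python
import numpy as np

def fit_through_origin(x, y, sigma):
    """One-parameter least squares for y = m*x with a common error sigma.

    m      = mean(x*y) / mean(x^2)
    Var(m) = sigma^2 / (N * mean(x^2))
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    m = np.mean(x * y) / np.mean(x ** 2)
    var_m = sigma ** 2 / (len(x) * np.mean(x ** 2))
    return m, var_m

# Noise-free data on the line y = 2x recovers m = 2 exactly.
m, var_m = fit_through_origin([1.0, 2.0, 3.0], [2.0, 4.0, 6.0], sigma=0.5)
```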

  2. #2
    Senior Member
    Joined
    Apr 2006
    Posts
    399
    Awards
    1
    Quote Originally Posted by tom88 View Post
    To get the formula you have for the variance, you calculated, either explicitly or implicitly,

    E((\sum_i X_i \epsilon)^2) = E((\epsilon \sum_i X_i)^2) = E(\epsilon^2 (\sum_i X_i)^2) = E(\epsilon^2) (\sum_i X_i)^2 = \sigma^2 (\sum_i X_i)^2.

    When the errors are different on all points, this becomes

    E((\sum_i X_i \epsilon_i)^2) = E(\sum_i \sum_j X_i\epsilon_i X_j \epsilon_j) = \sum_i \sum_j X_i X_j E(\epsilon_i \epsilon_j) = \sigma^2 (\sum_i X_i)^2,

    which is the same result. Can you justify the last step?
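One way to probe that last step is a quick Monte Carlo estimate of E((\sum_i X_i \epsilon_i)^2) with independent per-point errors (my own sketch, not part of the original post), comparing it against both candidate closed forms:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([1.0, 2.0, 3.0])
sigma = 0.5

# Independent errors eps_i, one per point, over many trials at once.
eps = rng.normal(0.0, sigma, size=(200_000, x.size))
s = (x * eps).sum(axis=1)            # sum_i x_i * eps_i for each trial
mc_estimate = np.mean(s ** 2)        # Monte Carlo E[(sum_i x_i eps_i)^2]

sum_of_squares = sigma ** 2 * np.sum(x ** 2)   # sigma^2 * sum_i x_i^2
square_of_sum = sigma ** 2 * np.sum(x) ** 2    # sigma^2 * (sum_i x_i)^2
```

With these numbers the simulation lands near sigma^2 * sum(x_i^2), which is the form the independence assumption E(eps_i eps_j) = 0 for i != j produces.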

  3. #3
    Newbie
    Joined
    Aug 2007
    Posts
    3
    Thanks for your reply. I'm slightly confused though. The result you have is slightly different to mine, because you effectively have (sum of x)^2 rather than (sum of x^2). Also, are you saying that the variance does not change when the individual errors are used? Surely it must?

  4. #4
    Senior Member
    Joined
    Apr 2006
    Posts
    399
    Awards
    1
    Quote Originally Posted by tom88 View Post
    Since your result is different, how about showing your derivation? Maybe this isn't so quick and easy (at least for me).
    Last edited by JakeD; August 15th 2007 at 10:29 PM.

  5. #5
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
You have a set of x_i and the corresponding observed values of the dependent variable y, with y_i = M x_i + \epsilon_i. We wish to find an estimate m of M which minimises:

O=\sum (y_i-m x_i)^2/\sigma_i^2

    where the \sigma_i^2 are known variances of the \epsilon_i's (which we assume are independent).

    So we solve:

\frac{dO}{dm}=-\sum \frac{2(y_i-mx_i)x_i}{\sigma_i^2}=0

    Which has solution:

m=\frac{\sum y_i x_i/\sigma_i^2}{\sum x_i^2/\sigma_i^2}

    which has variance:

var(m)=\frac{\sum x_i^2/\sigma_i^2}{(\sum x_i^2/\sigma_i^2)^2}=\frac{1}{\sum x_i^2/\sigma_i^2}

Please check this; it is quite likely that I have an error somewhere.

    RonL
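Those weighted formulas are straightforward to check numerically. A short sketch in Python (my own code; `weighted_fit_through_origin` is a hypothetical helper, not from the thread):

```python
import numpy as np

def weighted_fit_through_origin(x, y, sigma):
    """Minimise O = sum_i (y_i - m*x_i)^2 / sigma_i^2 over m.

    m      = sum(y_i x_i / sigma_i^2) / sum(x_i^2 / sigma_i^2)
    var(m) = 1 / sum(x_i^2 / sigma_i^2)
    """
    x, y, sigma = (np.asarray(a, dtype=float) for a in (x, y, sigma))
    w = 1.0 / sigma ** 2                        # per-point weights 1/sigma_i^2
    m = np.sum(w * x * y) / np.sum(w * x ** 2)
    var_m = 1.0 / np.sum(w * x ** 2)
    return m, var_m

# When all sigma_i are equal, these reduce to the common-sigma formulas
# from post #1: m = mean(xy)/mean(x^2), var(m) = sigma^2/(N*mean(x^2)).
m, var_m = weighted_fit_through_origin([1.0, 2.0], [2.0, 4.0], [1.0, 2.0])
```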

  6. #6
    Senior Member
    Joined
    Apr 2006
    Posts
    399
    Awards
    1
    Quote Originally Posted by CaptainBlack View Post
I see no errors. But we all have different setups and derivations in mind; e.g., your setup uses different variances, while mine assumes they are all the same. I asked tom88 to show his setup and derivation because that would be easier for him to work with than trying to reconcile his derivation with mine.

  7. #7
    Newbie
    Joined
    Aug 2007
    Posts
    3
Thank you both for your help with this. The results derived by CaptainBlack are the ones I was looking for, and they agree with what I have now derived myself, so it looks as if they are right!

    Sorry for any confusion, Jake. I didn't really understand either my setup or derivation until now!
