
Help with non-linear least squares

  1. #1
    Newbie
    Joined
    Aug 2010
    Posts
    8

    Help with non-linear least squares

    I'm trying to find least squares coefficients using the Gauss-Newton method (using Scilab as my numerical analysis software).

    Given the following data vectors:

    x = [57.9 108.2 149.6 227.9 778.3 1429.4] ' ;
    y = [0.2 0.6 1.0 1.9 11.9 29.5] ' ;

    (The ' sign denotes transpose in Scilab notation, so these are column vectors.)

    I'm trying to adjust the data to the following equation:

    y = ax^b,

    where a and b are the coefficients to find. (You will recognize this as Kepler's third law of planetary motion.)

    I'm using the following code to find \theta = (a, b), with the initial value \theta_0 = (0.000426, 1.53917).
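    (For reference, a minimal Gauss-Newton loop for this model in Scilab might look like the sketch below. This is illustrative only, not necessarily the poster's actual code; the iteration cap and stopping tolerance in particular are assumptions.)

    // Gauss-Newton for the model y = a*x^b -- illustrative sketch
    x = [57.9 108.2 149.6 227.9 778.3 1429.4]';
    y = [0.2 0.6 1.0 1.9 11.9 29.5]';
    theta = [0.000426; 1.53917];            // initial guess (a, b)
    for k = 1:50
        a = theta(1); b = theta(2);
        r = y - a*x.^b;                     // residuals
        J = [x.^b, a*(x.^b).*log(x)];       // Jacobian columns: d/da, d/db
        delta = J \ r;                      // linearized least-squares step
        theta = theta + delta;
        if norm(delta) < 1e-10 then break; end
    end
    disp(theta')                            // estimated (a, b)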

    So here's the problem: in the first iteration, the Jacobian comes out as a very badly scaled matrix:

    [Attached image: the Jacobian matrix from the first iteration]

    Note that the left column is x^b, whose entries are huge compared to those of the right column. This makes the matrix very badly scaled, so I never get to the solution.

    I've tried everything from QR decomposition to logarithmic scaling, but nothing seems to work for me.

    Any help will be much appreciated.

  2. #2
    Senior Member
    Joined
    Oct 2009
    Posts
    340
    Grinding through a non-linear least squares problem efficiently isn't really a statistical problem per se. That being said, I'm pretty sure you can solve this via iteratively re-weighted least squares, or else straightforward Newton's method.

    If you want to do it by hand, that's all I have to offer. But if my thinking is correct, the least squares solution should be the same as the one you get by fitting a GLM with Gaussian errors and a log link function, which is something I can do quickly, so I did. The solution I got was

    \theta = (0.00056575, 1.49509)
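    (For reference, one way such a fit could be reproduced in Scilab is iteratively re-weighted least squares for the Gaussian-error, log-link GLM: with log E[y] = b0 + b1*log(x), we have a = exp(b0) and b = b1. The sketch below follows the standard IRLS recipe; the iteration cap and tolerance are illustrative assumptions.)

    // IRLS for the Gaussian GLM with log link -- illustrative sketch
    X = [ones(x), log(x)];                  // design matrix for b0 + b1*log(x)
    beta = [log(0.000426); 1.53917];        // start from the poster's guess
    for k = 1:100
        eta = X*beta;
        mu  = exp(eta);                     // fitted means
        sw  = mu;                           // sqrt of IRLS weights (weights = mu.^2 for log link)
        z   = eta + (y - mu)./mu;           // working response
        beta_new = (X.*(sw*[1 1])) \ (sw.*z);  // weighted least-squares step
        if norm(beta_new - beta) < 1e-10 then break; end
        beta = beta_new;
    end
    disp([exp(beta(1)), beta(2)])           // (a, b)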

  3. #3
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    5
    Awards
    2
    If you're doing least squares, how about the following. Let \{x_{j},y_{j}\} be your data set, and define

    \displaystyle E:=\sum_{j=1}^{N}(y_{j}-ax_{j}^{b})^{2}.

    We compute the partial derivatives of E with respect to a and b, hoping that we'll be able to set them equal to zero, and thus solve for a and b. We get

    \displaystyle\frac{\partial E}{\partial a}=-2\sum_{j=1}^{N}\left[(y_{j}-ax_{j}^{b})x_{j}^{b}\right], and

    \displaystyle\frac{\partial E}{\partial b}=-2a\sum_{j=1}^{N}\left[(y_{j}-ax_{j}^{b})x_{j}^{b}\ln(x_{j})\right].

    Setting

    \displaystyle\frac{\partial E}{\partial a}=0 yields that

    \displaystyle a=\frac{\sum_{j=1}^{N}y_{j}x_{j}^{b}}{\sum_{j=1}^{N}x_{j}^{2b}}.

    Setting

    \displaystyle\frac{\partial E}{\partial b}=0 yields that

    \displaystyle\sum_{j=1}^{N}y_{j}x_{j}^{b}\ln(x_{j})-\left[\frac{\sum_{k=1}^{N}y_{k}x_{k}^{b}}{\sum_{k=1}^{N}x_{k}^{2b}}\right]\sum_{j=1}^{N}x_{j}^{2b}\ln(x_{j})=0.

    Solve this second equation numerically for b, then plug the result into the equation for a, and you're done.
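    (In Scilab, for example, this equation could be handed to fsolve — a rough sketch, assuming x and y are the data vectors from the first post; the starting point 1.5 is simply near the poster's initial guess for b.)

    // Solve dE/db = 0 for b, then back out a -- illustrative sketch
    function g = gb(b)
        // x and y are resolved from the calling environment (Scilab scoping)
        s = x.^b;
        g = sum(y.*s.*log(x)) - (sum(y.*s)/sum(s.^2))*sum(s.^2 .* log(x));
    endfunction

    b = fsolve(1.5, gb);                    // root of the second equation
    a = sum(y.*x.^b)/sum(x.^(2*b));         // plug back into the formula for a
    disp([a, b])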

    Caveat: I have no idea how numerically stable your algorithm for solving for b is going to be, but once you have solved for b, computing a should be straightforward.

    Does this make sense?

    And yes, theodds is right: this question probably belongs in the Analysis, Topology, and Differential Geometry forum. Or you could put it in either Advanced Applied Math or Other Advanced Topics.

  4. #4
    Newbie
    Joined
    Aug 2010
    Posts
    8
    First of all, thanks for your replies.

    You may be right about the forum placement. To be honest, I didn't really know where to put the question, so sorry for that.

    theodds, the values you came up with seem pretty close to the ones I'm looking for. I'm not really familiar with the method you mention, but I'll see if I can get the same results.

    Ackbeet, I followed some pretty similar steps to get the initial values I mentioned above, but they come out with significant errors, which is why I went for the Gauss-Newton method.

    Anyways, thanks a lot for your replies.

    Cheers.

