Basic steps of NLLS regression

Hi,

I am having trouble implementing the most basic NLLS algorithm. I am following the example from here in every detail, yet my implementation keeps diverging after 5 or so iterations. I will give an outline of what I'm doing below.

The example fits the data to a Gaussian function, so the parameters for my model are $\displaystyle params = [A, x_0, \sigma]$.

**Step 1**

Make an initial guess of the parameters

$\displaystyle A^0 = 0.8$

$\displaystyle x_0^0 = 15$

$\displaystyle \sigma^0 = 4$

**Step 2**

Compute the values of the Jacobian using the current parameter values for each data point $\displaystyle i=1,2,...,m$:

$\displaystyle J_{i,A} = \exp\left(-\frac{(x_i - x_0)^2}{2\sigma^2}\right)$

$\displaystyle J_{i,x_0} = \frac{A(x_i - x_0)}{\sigma^2}\exp\left(-\frac{(x_i - x_0)^2}{2\sigma^2}\right)$

$\displaystyle J_{i,\sigma} = \frac{A(x_i - x_0)^2}{\sigma^3}\exp\left(-\frac{(x_i - x_0)^2}{2\sigma^2}\right)$

(Note that the derivative with respect to $\sigma$ carries the squared factor $(x_i - x_0)^2$; with only a first power there, the step for $\sigma$ is wrong and the iteration can diverge.)
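
A quick way to catch a slip in these derivatives is to check them against central finite differences at the initial guess. A minimal NumPy sketch (the function names here are illustrative, not from any library):

```python
import numpy as np

def model(x, A, x0, sigma):
    """Gaussian model A * exp(-(x - x0)^2 / (2 sigma^2))."""
    return A * np.exp(-(x - x0) ** 2 / (2.0 * sigma ** 2))

def analytic_jacobian(x, A, x0, sigma):
    """One row per data point; columns are d/dA, d/dx0, d/dsigma."""
    e = np.exp(-(x - x0) ** 2 / (2.0 * sigma ** 2))
    return np.column_stack([
        e,                                   # d f / d A
        A * (x - x0) * e / sigma ** 2,       # d f / d x0
        A * (x - x0) ** 2 * e / sigma ** 3,  # d f / d sigma
    ])

def numeric_jacobian(x, params, h=1e-6):
    """Central finite differences, for checking the analytic columns."""
    cols = []
    for k in range(len(params)):
        pp, pm = np.array(params, float), np.array(params, float)
        pp[k] += h
        pm[k] -= h
        cols.append((model(x, *pp) - model(x, *pm)) / (2.0 * h))
    return np.column_stack(cols)
```

If the two Jacobians disagree in a single column, that column's derivative is the bug.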

**Step 3**

Compute the value of the current model approximation

$\displaystyle \hat{f}_i = A\exp\left(-\frac{(x_i - x_0)^2}{2\sigma^2}\right)$

Error between the data set and the current model:

$\displaystyle \Delta\beta_i = y_i - \hat{f}_i$

If the error is still too large (i.e. the fit has not converged), update the parameter values as follows

$\displaystyle \Delta{params} = (J^T J)^{-1} (J^T \Delta\beta)$

$\displaystyle params^{k+1} = params^k + \Delta{params}$

Loop back to **Step 2**.
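
The steps above can be sketched end-to-end in NumPy. This is a minimal illustration under the assumptions in this thread (the function name is mine), not a production implementation; note the $(x_i - x_0)^2$ factor in the $\sigma$ column of the Jacobian:

```python
import numpy as np

def fit_gaussian_gauss_newton(x, y, A=0.8, x0=15.0, sigma=4.0,
                              n_iter=50, tol=1e-8):
    """Gauss-Newton fit of y ~ A * exp(-(x - x0)^2 / (2 sigma^2))."""
    params = np.array([A, x0, sigma], dtype=float)
    for _ in range(n_iter):
        A, x0, sigma = params
        e = np.exp(-(x - x0) ** 2 / (2.0 * sigma ** 2))
        r = y - A * e                        # Delta beta: data minus model
        J = np.column_stack([                # Jacobian, one row per point
            e,
            A * (x - x0) * e / sigma ** 2,
            A * (x - x0) ** 2 * e / sigma ** 3,
        ])
        # Solve the normal equations (J^T J) delta = J^T r as a least-squares
        # problem, which avoids forming the explicit inverse
        delta, *_ = np.linalg.lstsq(J, r, rcond=None)
        params += delta
        if np.linalg.norm(delta) < tol:      # step is tiny: converged
            break
    return params
```

Solving with `lstsq` rather than explicitly inverting $J^TJ$ is one common way to keep the step numerically stable; if the iteration still diverges from a reasonable starting guess, the usual remedy is a damped step (Levenberg-Marquardt).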

If my pseudo-code has confused things, please ignore it and just describe how such a basic NLLS algorithm *should* be implemented.

Thanks! :D

Re: Basic steps of NLLS regression

Hey algorithm.

I don't have deep working knowledge of these kinds of solvers, but you might want to look at something that has been formalized and studied over the past couple of decades. The EM algorithm is a good one to look into:

Expectation

In terms of your actual issue, this is most likely (assuming your code is correct) a question of the stability of your scheme.

In terms of fitting a normal distribution, the EM algorithm has a lot of implementations, including in packages like R. If you can look at the R code for this algorithm, you can get a much better idea of what's going on and compare it to your own implementation:

The R Project for Statistical Computing

A search on the implementation brought up the thread below, and after reading it I am hesitant to offer specific advice, so I will just post it as food for thought:

R help - EM algorithm

For a normal model (as in a normal PDF), there is a paper with an R implementation that uses bootstrapping in combination with the EM:

http://homepage.stat.uiowa.edu/~kcow...7/chang166.pdf

Re: Basic steps of NLLS regression

Hi chiro,

Thanks for your input and the links.

What I must use is specifically NLLS, and what's more I need to generalise it to non-Gaussian functions. The reason I am testing with a Gaussian function is that I know what the solution should look like if the algorithm works.

Re: Basic steps of NLLS regression

Did you run a search for non-linear optimization code examples (specific implementations) that do this?

There is bound to be one out there for this purpose (and many statistical packages do this specifically for a variety of distributions).

Google turned up this:

Levenberg-Marquardt in C/C++
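
The link points at C/C++ implementations; as a language-agnostic sketch of what Levenberg-Marquardt adds over the plain Gauss-Newton step (a damping term $\lambda$ that grows when a step fails and shrinks when it succeeds), here is a minimal Python version. The function name and the factor-of-2 damping schedule are illustrative choices, not taken from any particular library:

```python
import numpy as np

def levenberg_marquardt(f, jac, params, y, n_iter=100, lam=1e-3):
    """Minimal LM loop: damped Gauss-Newton with an adaptive lambda.

    f(params) returns the model values; jac(params) the m x p Jacobian.
    """
    params = np.array(params, dtype=float)
    r = y - f(params)
    cost = r @ r
    for _ in range(n_iter):
        J = jac(params)
        H = J.T @ J
        # Marquardt's scaling: damp along the diagonal of J^T J
        delta = np.linalg.solve(H + lam * np.diag(np.diag(H)), J.T @ r)
        trial = params + delta
        r_trial = y - f(trial)
        cost_trial = r_trial @ r_trial
        if cost_trial < cost:   # step helped: accept it, trust the model more
            params, r, cost = trial, r_trial, cost_trial
            lam *= 0.5
        else:                   # step hurt: reject it, damp more heavily
            lam *= 2.0
    return params
```

For the Gaussian model in this thread, `f` and `jac` would be small wrappers returning the model values and the $m \times 3$ Jacobian from the earlier posts. As $\lambda \to 0$ this reduces to the Gauss-Newton update, while large $\lambda$ makes it behave like small gradient-descent steps, which is exactly what rescues a diverging iteration.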