Regression by RMS method: Improving fit.
I'm looking for some help with a regression problem.
I have written an Excel macro that fits a logarithmic function to a time-series data set. The function I am using is:
Y'(x) = a*Ln(x + b) + c
where a, b and c are constants and x is the Julian date (days since 1 Jan 1900).
I input a guess for a, b and c, and the program then calculates the root-mean-square (RMS) of the residuals:
RMS = √[∑(Y(x) − Y'(x))² / n]
where Y(x) is the measured value at time x, Y'(x) is the fitted value, and n is the number of data points.
Excel's Solver then varies a, b and c to minimise this RMS value.
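For anyone who wants to reproduce the setup outside Excel, here is a minimal sketch of the objective in Python (the data values are hypothetical placeholders, not my real series):

```python
import numpy as np

# Hypothetical placeholder data: x in days since 1 Jan 1900, y measured values.
x = np.array([36526.0, 36556.0, 36586.0, 36617.0, 36647.0])
y = np.array([1.2, 1.9, 2.3, 2.6, 2.8])

def model(x, a, b, c):
    # Y'(x) = a*Ln(x + b) + c
    return a * np.log(x + b) + c

def rms_residual(params, x, y):
    # Root-mean-square of the residuals -- the quantity Solver minimises.
    a, b, c = params
    return np.sqrt(np.mean((y - model(x, a, b, c)) ** 2))
```

In Python one could hand rms_residual directly to scipy.optimize.minimize; in my macro, Solver plays that role.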
The problem is that when I run this macro, the fits are very poor. By eye the data looks vaguely curved, possibly logarithmic, but the fitted functions look linear and do not closely represent the data.
I suspect the Solver is converging to a local minimum of the objective and stopping there. If so, can anyone suggest a way to choose better initial guesses for a, b and c, so that the Solver is more likely to reach a good fit?
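One idea I have been toying with, sketched below in Python with hypothetical placeholder data (names and values are illustrative only): for any fixed b, the model is linear in Ln(x + b), so a and c can be found by ordinary least squares, and b itself can be scanned over a grid. Since my x values are around 36,500, Ln(x + b) is nearly linear unless b is close to −min(x), which may also explain why my fits look linear.

```python
import numpy as np

# Hypothetical placeholder data: x in days since 1 Jan 1900, y measured values.
x = np.array([36526.0, 36556.0, 36586.0, 36617.0, 36647.0])
y = np.array([1.2, 1.9, 2.3, 2.6, 2.8])

def fit_for_b(x, y, b):
    """For a fixed b the model a*Ln(x+b) + c is linear in Ln(x+b),
    so a and c follow from ordinary least squares. Returns (a, c, rms)."""
    t = np.log(x + b)
    a, c = np.polyfit(t, y, 1)
    resid = y - (a * t + c)
    return a, c, np.sqrt(np.mean(resid ** 2))

def initial_guess(x, y):
    # Scan trial values of b (keeping x + b > 0) and keep the best triple.
    best = None
    for b in np.linspace(-np.min(x) + 1.0, 1000.0, 200):
        a, c, rms = fit_for_b(x, y, b)
        if best is None or rms < best[3]:
            best = (a, b, c, rms)
    return best[:3]
```

The resulting (a, b, c) would then seed Solver, rather than my current hand-picked guesses.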
Otherwise could anyone suggest an alternative (improved?) regression method?
I can show my code and/or a sample of the data if anyone is interested.
Thanks in advance (Happy)