Numerical Iterations -- Predictions to achieve convergence
Using the arc-length formula

$$S = \int_a^b \sqrt{1 + \big(f'(x)\big)^2}\, dx$$

with bounds $a = 0$ and $b = 1$, I was able to determine the arc length of my function $y = f(x)$. I did this by numerical integration, using enough sub-intervals that the result was consistent in the hundredths digit. Knowing $S$, I want to divide the curve into $n$ sub-intervals of equal arc length $L = S/n$. For the first section, with a new upper limit $b < 1$, I can write

$$\int_{0}^{b} \sqrt{1 + \big(f'(x)\big)^2}\, dx = L$$
and iterate on $b$ until the result is consistent in the hundredths digit, then repeat for each subsequent section. This is extremely time consuming. It would be convenient to integrate the equation by hand, but because of the high-degree polynomials that appear in $y = f(x)$, the integral must be evaluated numerically. Is there an optimal way to guess $b$ (similar to the shooting method) so that a solution is reached in only a few iterations?
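One option along these lines is Newton's method: writing $g(b) = \int_a^b \sqrt{1 + f'(x)^2}\, dx - L$, the fundamental theorem of calculus gives $g'(b) = \sqrt{1 + f'(b)^2}$, so the derivative needed for each Newton step costs only a single integrand evaluation. A minimal sketch, using $f(x) = x^2$ as a hypothetical stand-in for the actual function and composite Simpson's rule as the quadrature:

```python
import math

def f_prime(x):
    # Hypothetical example: f(x) = x^2, so f'(x) = 2x; substitute your own derivative
    return 2.0 * x

def integrand(x):
    # Arc-length integrand sqrt(1 + f'(x)^2)
    return math.sqrt(1.0 + f_prime(x) ** 2)

def arc_length(a, b, n=200):
    # Composite Simpson's rule on [a, b] with n (even) sub-intervals
    h = (b - a) / n
    total = integrand(a) + integrand(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * integrand(a + i * h)
    return total * h / 3.0

def find_b(a, L, guess, tol=1e-6, max_iter=50):
    # Newton's method on g(b) = arc_length(a, b) - L.
    # By the fundamental theorem of calculus, g'(b) = sqrt(1 + f'(b)^2),
    # so the derivative is just one integrand evaluation per step.
    b = guess
    for _ in range(max_iter):
        g = arc_length(a, b) - L
        if abs(g) < tol:
            return b
        b -= g / integrand(b)
    return b

# Divide the curve on [0, 1] into 4 pieces of equal arc length
S = arc_length(0.0, 1.0)
L = S / 4
breaks = [0.0]
for _ in range(4):
    breaks.append(find_b(breaks[-1], L, guess=breaks[-1] + 0.25))
```

Because the integrand is always at least 1, $g$ is strictly increasing and Newton's method typically converges here in a handful of iterations, compared with many manual trials of $b$ per section.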