Originally Posted by **tamzam600**

Hello,

My question is simple to state: I would like to **numerically** solve the following first-order ODE for v(x):

v'(x) = b*[v(x) - f(x)] , given the boundary condition v(+infinity) = 0, where b is a known constant. Equivalently, I want to realize the operator b/(b - D), where D = d/dx.

These are the problems:

1) f(x) is not known explicitly; it is only sampled on a dense x grid.

2) f(x) is not smooth: there is a large jump in df(x)/dx (a kink) at one point.

3) This is a final-value problem, so I need an algorithm that marches backwards instead of the usual forward Runge-Kutta schemes.
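On point 3, a standard adaptive solver can already march backwards simply by giving it a decreasing integration span. A minimal sketch with SciPy's `solve_ivp`, assuming b > 0 and using f(x) = exp(-x) as a hypothetical smooth stand-in whose exact answer, v(x) = b/(b+1) * exp(-x), is known:

```python
import numpy as np
from scipy.integrate import solve_ivp

b = 2.0
f = lambda x: np.exp(-x)  # stand-in for the sampled f, chosen for its known answer

# Start at a large x_max where v is effectively 0 and integrate DOWN to x_min.
# solve_ivp accepts a decreasing t_span, which runs the RK stepper backwards;
# in this direction the homogeneous mode exp(b*x) decays, so the sweep is stable.
x_max, x_min = 30.0, 0.0
sol = solve_ivp(lambda x, v: b * (v - f(x)),
                t_span=(x_max, x_min), y0=[0.0],
                dense_output=True, rtol=1e-10, atol=1e-12)

v_exact = lambda x: b / (b + 1.0) * np.exp(-x)
print(abs(sol.sol(1.0)[0] - v_exact(1.0)))  # error vs. closed form
```

With a smooth f this is all that is needed; a kink in f mainly forces the adaptive stepper to take small steps near the kink.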

I know that using the integrating factor, we obtain the following solution

v(x) = b*exp(b*x) * integral(from x to +infinity) [exp(-b*y)*f(y) dy]

(the limits x to +infinity follow from the boundary condition v(+infinity) = 0).

However, on a dense x grid this means evaluating the above integral once per grid point, which is computationally very slow if each evaluation starts from scratch.
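The integral does not have to be recomputed from scratch: splitting the integral from x_i at the next grid point x_{i+1} turns the integrating-factor solution into a one-step backward recurrence, v_i = exp(-b*h)*v_{i+1} + (weighted f samples), so the whole grid costs O(N). A minimal NumPy sketch with my own (hypothetical) naming, assuming a uniform grid, b > 0, and f treated as piecewise linear between samples:

```python
import numpy as np

def solve_backward(x, f, b):
    """v' = b*(v - f), v(+inf) = 0, in one O(N) backward sweep.

    The weights come from integrating exp(-b*s) against the linear
    interpolant of f on each cell (an exponential integrator built
    from the integrating-factor solution), so the update is exact
    whenever f is piecewise linear between grid points.
    Assumes a uniform grid x and b > 0.
    """
    h = x[1] - x[0]
    E = np.exp(-b * h)
    # integral of exp(-b*s)*(f_i + (f_{i+1}-f_i)*s/h) over one cell:
    w1 = (1.0 - E * (1.0 + b * h)) / (b * h)   # weight on f[i+1]
    w0 = (1.0 - E) - w1                         # weight on f[i]
    v = np.zeros_like(f)
    v[-1] = 0.0                                 # far-field condition v(+inf) = 0
    for i in range(len(x) - 2, -1, -1):
        v[i] = E * v[i + 1] + w0 * f[i] + w1 * f[i + 1]
    return v

# quick check against the closed form for f(x) = exp(-x):
#   v(x) = b/(b+1) * exp(-x)
b = 2.0
x = np.linspace(0.0, 30.0, 3001)
v = solve_backward(x, np.exp(-x), b)
print(np.max(np.abs(v - b / (b + 1.0) * np.exp(-x))))
```

Because the weights are exact for piecewise-linear f, a kink in f sitting on a grid point costs no accuracy, and the sweep stays stable since exp(-b*h) < 1.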

Can you please help me? Any hints for solving the ODE or for realizing the b/(b-D) operator would be highly appreciated...