Hi. I have a certain applied math problem I'm pondering, but the core question seems to be more meta, so I'll ask it in a more general form. In brief, I'm wondering if calculus can be used to find the solution to a problem when the problem is solved by an infinite repetition of function application.
In my problem, there are:
1. A function f(x), which produces an approximate solution to the problem.
2. An original input, x0.
If I repeatedly apply f to x0, as in f(f(f(f(x0)))), or (f . f . f . f ...) x0,
I seem to slowly get closer to the exact solution. The final solution seems to be the limit of repeated applications of f as the number of applications approaches infinity. So, is this the sort of problem that calculus can help with? Or is it fundamentally incompatible with calculus because of the discrete element in the problem? (I.e., the repeated function application being the discrete element.)
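For concreteness, here's a minimal sketch (in Python, with a made-up f) of the kind of behavior I mean. The particular f below is one step of Heron's method for the square root of 2, chosen only as an illustration; my actual f is different, but it behaves the same way in that each application gets closer to the exact answer:

```python
import math

def f(x):
    # One hypothetical "approximate solution" step:
    # a single Heron's-method update toward sqrt(2).
    return (x + 2.0 / x) / 2.0

x = 3.0  # x0, the original input
for _ in range(20):  # (f . f . f . ...) x0, truncated at 20 applications
    x = f(x)

print(x)  # gets closer and closer to sqrt(2) = 1.41421356...
```

After only a handful of applications the iterate agrees with math.sqrt(2) to machine precision, which is what makes me think the "exact" answer is a limit of the iteration as the number of applications goes to infinity.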