This is my first question on this forum. It is very general, for which I apologize in advance. If this is not the right place, I would appreciate a pointer to a site or forum where it would be better addressed.
My math background: some linear and abstract algebra, calculus/analysis, some mathematical logic, and probability/statistics. I have had a few more exotic courses, but none directly relevant to the question here. I am (or try to be) a programmer (a software engineer, if I am being fancy).
Here is my problem. Many mathematical statements depend on the faithfulness of the model/theory, i.e. on the modeled entities actually satisfying the axioms or on the system of constraints actually applying. In practice this is never entirely the case, and every observation provides some evidence to the contrary. If I may hazard a layman's guess, I am under the impression that even stochastic treatments work in rather broad strokes, i.e. they assume particular probability distributions, and I am not sure how much qualitative deviation such solutions can absorb before it has to be quantified. (Maybe?)
Let me give one example, just to illustrate my confusion. A shape that resembles a ball, even a truly solid one, will generally have small perturbations of its surface. Consequently its actual surface area may be considerably bigger than that of the ideal sphere. This is the contact surface between the physical object and its environment, which, I imagine, directly determines the dynamics of many physical processes, so there will be a considerable deviation. In practice the formulas come with many calibration parameters, so the result gets adjusted accordingly, but not because the mathematical part of the theory demands it. I could similarly argue that there is no tangent plane at any point of the surface, because the secant planes approaching from the exterior do not converge. (These correspond to secant constructions at ever greater magnifications of the physical object.)
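To make the secant argument concrete, here is a small Python sketch (the function names and the particular roughness model are my own illustrative choices, not taken from any specific theory). It compares secant slopes of a smooth profile with those of a Weierstrass-type "rough" perturbation of it: for the smooth function the slopes settle as the step shrinks, while for the rough one they keep jumping around, which is the one-dimensional analogue of secant planes that fail to converge under magnification.

```python
import math

def smooth(x):
    """The idealized, smooth model profile."""
    return math.sin(x)

def rough(x, terms=12):
    """sin(x) plus a Weierstrass-type sum: a finite stand-in for
    surface roughness at ever finer scales.  (The finite sum is
    technically differentiable, but over the step sizes probed below
    its secants behave like the nowhere-differentiable limit.)"""
    return math.sin(x) + sum(0.6 ** k * math.cos(7 ** k * x) for k in range(terms))

def secant_slopes(f, x, hs):
    """Secant slopes (f(x+h) - f(x)) / h for a list of step sizes h."""
    return [(f(x + h) - f(x)) / h for h in hs]

hs = [10.0 ** -k for k in range(1, 8)]   # h = 1e-1 down to 1e-7
x0 = 1.0

smooth_slopes = secant_slopes(smooth, x0, hs)
rough_slopes = secant_slopes(rough, x0, hs)

# The smooth slopes approach cos(1.0) ~ 0.5403; the rough ones do not settle.
for h, s1, s2 in zip(hs, smooth_slopes, rough_slopes):
    print(f"h = {h:.0e}   smooth: {s1:+.4f}   rough: {s2:+12.2f}")
```

The point of the sketch is only that "zooming in" on a rough object need not stabilize the local linear approximation the way the smooth model promises.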
My question: are there fields that address the subject of inaccurate modeling and its effects in detail? For example, in algebra I imagine morphism-like functions between the elements of two algebraic structures. Say, instead of a homomorphism between two groups, the theory studies functions that respect the group operations only up to a certain loss of precision, or up to some random error; similarly for vector spaces, rings, fields, etc. A vector space might be allowed a slight inaccuracy in the distributive law, and the morphisms between such a pseudo-vector space and a genuine one could be studied: how much loss of structure results in how much loss of applicability of solutions to problems constructed in the "fully" structured space? I wonder if such studies could yield prescriptive results.

In analytic geometry, shapes and surfaces could be modified by various disturbances. Similarly, in calculus/analysis, functions that are not entirely smooth might be associated with smooth counterparts (or collections of smooth counterparts). What properties justify "approximate" integration and differentiation? (Maybe numerical analysis handles this to some extent.) Can structural deviations be studied in this highly abstract manner, or must they be handled primarily case by case?
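The group example you describe does get studied: maps whose "homomorphism defect" f(a·b) − f(a)·f(b) stays bounded appear under names such as approximate homomorphisms, quasimorphisms, and Hyers–Ulam stability (which asks when an approximate homomorphism must lie close to an exact one). A minimal numeric sketch of the idea, where the map f and the constant C are arbitrary illustrative choices of mine:

```python
import math

C = math.sqrt(2)  # an arbitrary irrational slope

def f(x: int) -> int:
    """An approximate homomorphism (Z, +) -> (Z, +): x |-> floor(C*x).
    Not a homomorphism, but close to the exact one x |-> C*x."""
    return math.floor(C * x)

def defect(a: int, b: int) -> int:
    """How far f is from respecting the group operation at (a, b)."""
    return f(a + b) - (f(a) + f(b))

# The defect stays bounded (here it is always 0 or 1), and f never
# strays further than distance 1 from the exact homomorphism x -> C*x.
defects = {defect(a, b) for a in range(-50, 51) for b in range(-50, 51)}
print("observed defects:", sorted(defects))          # a subset of {0, 1}
print("max distance from x -> C*x:",
      max(abs(f(x) - C * x) for x in range(-50, 51)))  # strictly below 1
```

This is only a toy instance, but it shows the flavor of a quantitative question of the kind you ask: bound the defect, then ask how close the "broken" structure-preserving map must be to a genuine one.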
I realize I don't have a concrete idea of what I am looking for. Anything of that nature you can point me to will be appreciated.
Thanks and regards,