Originally Posted by

**topsquark** So I'm working on this Physics problem, which (alas!) turns into a Math problem with no closed-form solution. I have the equation (as a function of t):

$\displaystyle e^{at} = a^{\prime} t + b^{\prime};~~0 \leq t \leq T$

where $a$ is presumed known.

The question boils down to how to find reasonable values for a' and b' that best model the above equation. (I can't simply do a Taylor Expansion on the left-hand side because T could be large.) Now, being the Physicist I am, I would know how to find an estimate if the equation were

$\displaystyle e^{at} = b^{\prime};~~0 \leq t \leq T$

The "best fit" in this case would be to find the average of the exponential function as t goes from 0 to T.
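A quick sanity check of that "average" idea: the mean of $e^{at}$ over $[0, T]$ has the closed form $(e^{aT}-1)/(aT)$. The sketch below (the values $a = 0.5$, $T = 2.0$ are illustrative, not from the post) verifies the numerical average against that formula:

```python
import numpy as np

# Illustrative constants (not from the original post).
a, T = 0.5, 2.0

# Numerical average of e^{at} over [0, T] via the midpoint rule.
n = 200_000
dt = T / n
t_mid = (np.arange(n) + 0.5) * dt
numeric_mean = np.sum(np.exp(a * t_mid)) * dt / T

# Closed form: (1/T) * int_0^T e^{at} dt = (e^{aT} - 1) / (aT)
closed_form = (np.exp(a * T) - 1.0) / (a * T)

print(numeric_mean, closed_form)  # the two agree to high precision
```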

Is there a way to think of this equation as a "best fit" problem by using the exponential function as if it were "data?"

-Dan
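One way to make the "exponential as data" reading concrete, sketched under illustrative values of $a$ and $T$ (not from the post): sample $e^{at}$ densely on $[0, T]$ and fit a line by ordinary least squares, then compare against the continuous $L^2$ fit that minimizes $\int_0^T (e^{at} - a't - b')^2\,dt$, whose normal equations involve only the moments of $1$, $t$, and $e^{at}$ on $[0, T]$:

```python
import numpy as np

# Illustrative constants (not from the original post).
a, T = 0.5, 2.0

# Treat e^{at}, sampled densely on [0, T], as "data".
t = np.linspace(0.0, T, 10_001)
y = np.exp(a * t)

# Discrete least-squares line through the sampled data.
a_prime, b_prime = np.polyfit(t, y, 1)  # slope a', intercept b'

# Continuous L2 fit: minimize int_0^T (e^{at} - a't - b')^2 dt.
m0 = (np.exp(a * T) - 1.0) / a           # int_0^T e^{at} dt
m1 = T * np.exp(a * T) / a - m0 / a      # int_0^T t e^{at} dt
G = np.array([[T, T**2 / 2],
              [T**2 / 2, T**3 / 3]])     # Gram matrix of {1, t}
b_cont, a_cont = np.linalg.solve(G, np.array([m0, m1]))

print(a_prime, b_prime)  # close to the continuous a', b'
```

With dense enough sampling, the discrete fit converges to the continuous one, so either route gives essentially the same a' and b'.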