Real-world utility(?) of derivatives to estimate rates of change
Pardon a perhaps naive question from someone who is teaching himself calculus.
I understand that the derivative of a function can be used to estimate things like marginal cost, marginal production, etc.
Yet with spreadsheets, or even function-enabled calculators, available to most people, I wonder -- why not just use the original function with new values, guaranteeing a fully correct answer, rather than using the derivative, which only gives an approximately correct answer?
Let me give an example from my textbook. (I won't show all the work as I'm not asking how to solve it).
Suppose z, the total weekly production of widgets, is a function of the number of skilled and unskilled workers (x and y respectively) as follows:
z = f(x,y) = 12,000x + 5,000y + 10x^2y - 10x^3.
Assume x = 40 and y = 60.
I understand (or think I do; as ever, please correct me if I'm wrong) that if you want to estimate the additional number of widgets produced when one more skilled worker is added to the work force, you can take the partial derivative of z with respect to x,
∂z/∂x = 12,000 + 20xy - 30x^2,
and plug in (40,60), which gives an approximation of 12,000 additional widgets.
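To make this concrete, here is the estimate as I would check it numerically (a quick sketch of my own in Python; the function name is mine, not the textbook's):

```python
# Evaluate the partial derivative dz/dx = 12,000 + 20xy - 30x^2
# at x = 40 skilled and y = 60 unskilled workers.
def dz_dx(x, y):
    return 12_000 + 20 * x * y - 30 * x ** 2

print(dz_dx(40, 60))  # 12000
```

The 20xy and 30x^2 terms happen to cancel exactly at (40,60), so the constant 12,000 is the whole estimate.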
I also understand that if you want more than an estimate, you can "nail down" the answer by simply evaluating the original function at the new value:
z = f(41,60) = 12,000(41) + 5,000(60) + 10(41)^2(60) - 10(41)^3 = 1,111,390,
an increase of 11,390 widgets over f(40,60) = 1,100,000.
The approximation provided by the partial derivative (12,000, versus an exact increase of 11,390) is about 5% too high.
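And here is the comparison itself, again as my own Python sketch (assuming, as above, that the first term of the production function is 12,000x):

```python
# Compare the exact change in weekly production with the
# partial-derivative estimate when one skilled worker is added.
def f(x, y):
    # Weekly widget production: x skilled, y unskilled workers.
    return 12_000 * x + 5_000 * y + 10 * x ** 2 * y - 10 * x ** 3

exact = f(41, 60) - f(40, 60)  # 11,390 additional widgets
estimate = 12_000              # dz/dx evaluated at (40, 60)
print(f"exact={exact}, estimate={estimate}, "
      f"error={estimate / exact - 1:.1%}")  # error is about +5.4%
```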
In some cases, a 5% error is no problem. But what if the widgets in question are big-ticket items, meaning that lots of money is at stake? Five percent of a very big number is material.
So why, when Excel and the like make it so easy to evaluate the same function with different values plugged in for the variables, would one even bother with the derivative method?