Business: Total Savings. A company installs a new computer that is expected to generate savings at the rate of 20,000e^(-0.02t) dollars per year, where t is the number of years the computer has been in operation. If the computer originally cost $230,000, when will it "pay for itself"? Round the answer to the nearest tenth. [Hint: Find the indefinite integral, then solve for the constant C.]
I keep getting to what seems like the last step and ending up with e^(-0.02t) + C = -0.23.
I know I'm doing something wrong; I just can't figure out where. Thanks in advance.
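For anyone checking their work against this, here is a short Python sketch of how the payback time works out under my reading of the problem (the key assumption is that total savings are 0 at t = 0, which fixes the constant C; the specific numbers below follow from that assumption, not from the original poster's steps):

```python
import math

# Total savings after t years (antiderivative of the savings rate):
#   S(t) = ∫ 20000·e^(-0.02t) dt = -1_000_000·e^(-0.02t) + C
# Assuming S(0) = 0 gives C = 1_000_000, so
#   S(t) = 1_000_000·(1 - e^(-0.02t))
# The computer "pays for itself" when S(t) = 230_000:
#   1 - e^(-0.02t) = 0.23  →  e^(-0.02t) = 0.77
t = -math.log(0.77) / 0.02
print(round(t, 1))  # → 13.1 (years)
```

Note that e^(-0.02t) is always positive, so an equation of the form e^(-0.02t) = -0.23 can never hold; a stray sign (the negative from integrating e^(-0.02t)) is the usual culprit at that step.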