This problem doesn't sound quite right to me. A computer or computing system is not usually thought of as an investment in the sense of depreciation. Computers are sold second hand, of course, but the real value of a computer to a business is whatever value using it provides, through whatever the company does with it. The computer's resale value will decline over time if you were to sell it to anyone who would buy it, but the value it provides to the company doesn't depreciate the way its resale value does. Does that make sense? Maybe you could explain something I'm missing.
Basically, f(t) and g(t) as described do not sound like they are measuring the same thing.
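To make the distinction concrete (these formulas are my own illustration, not anything from the problem): resale value is often modeled with straight-line depreciation, while the operational value the machine delivers to the business might stay roughly flat until it's retired, something like

$$f(t) = V_0\left(1 - \frac{t}{N}\right), \qquad g(t) \approx \text{constant benefit per year}, \qquad 0 \le t \le N,$$

where $V_0$ is the purchase price and $N$ is the depreciation lifetime. Both are perfectly sensible functions of $t$, but they track different quantities, so setting them equal or comparing them directly doesn't obviously mean anything.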