I have the following problem:

A researcher has funds to buy enough computing power to number-crunch a problem in 8 years. Computing power per dollar doubles every 23 months.

When should he buy his computers so that the problem is finished as soon as possible? Give your answer as a decimal in months, accurate to within 0.1 months.

Suppose the problem would take C months on current computers. What is the largest value of C for which he should buy the computers immediately? Give your answer as a decimal accurate to 0.1 months.

I have no idea where to start with this one. Could anybody help me?

My only thought so far: let x be the amount of time he waits before buying. The total time until the problem is solved is then x plus the crunching time, which is something less than 96 months (8 years), right?
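One way to formalize that last thought: if waiting x months makes the hardware 2^(x/23) times faster, the total time is x + C·2^(-x/23), where C = 96 months, and you can minimize that in x. Here is a minimal sketch of that setup (the names `C`, `T`, and `finish_time`, and the assumption that speed grows continuously rather than in discrete doublings, are mine, not part of the problem statement):

```python
import math

C = 96.0   # months the problem takes on today's hardware (8 years)
T = 23.0   # months for computing power per dollar to double

def finish_time(x):
    """Total time if he waits x months: the wait itself, plus the
    crunch, which shrinks by a factor of 2 for every T months waited."""
    return x + C / 2 ** (x / T)

# Setting the derivative to zero:
#   d/dx [x + C * 2^(-x/T)] = 1 - (C * ln 2 / T) * 2^(-x/T) = 0
# gives a closed form for the optimal wait:
x_opt = T * math.log2(C * math.log(2) / T)

print(round(x_opt, 1))                 # optimal wait, months
print(round(finish_time(x_opt), 1))    # total time to solution, months

# x_opt <= 0 exactly when C <= T / ln 2, which answers the second part:
print(round(T / math.log(2), 1))       # largest C for buying immediately
```

This isn't a full derivation, just a way to check any calculus you do by hand: plot or scan `finish_time` over a range of x and confirm the minimum lands where the closed form says it should.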