Originally Posted by

**ThePerfectHacker** In economics there is a rule of thumb: if you place money in a bank and earn $\displaystyle R$ percent each year, then the number of years needed to double your money is approximately $\displaystyle \frac{72}{R}$.

Say you have 100 dollars and earn 50% each year. After 1 year you have 150 dollars, and after 2 years you have $\displaystyle 150 \times 1.5 = 225 > 200$ dollars. So it takes 2 years to double the money. If we use the formula, we get

$\displaystyle \frac{72}{50}=1.44$. This is only an approximation; since the interest is credited once a year, the actual answer must be a whole number of years. The estimate tells us that 1 year is too little and 2 years is enough, so the formula works.
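The check above can be repeated for other rates with a few lines of code. The sketch below (my own, not from the original post; the function name `years_to_double` is made up) counts the whole years of annual compounding needed to at least double an investment and prints the rule's estimate $72/R$ next to it:

```python
def years_to_double(rate_percent):
    """Smallest integer n of years with (1 + rate/100)**n >= 2."""
    factor = 1 + rate_percent / 100
    amount, years = 1.0, 0
    while amount < 2.0:
        amount *= factor
        years += 1
    return years

# Compare actual doubling time (whole years) with the rule-of-72 estimate.
for r in (4, 6, 12, 50):
    print(f"R = {r:2d}%: actual = {years_to_double(r):2d} years, "
          f"rule of 72 = {72 / r:.2f}")
```

For moderate rates such as 4% or 6% the estimate lands almost exactly on the true doubling time, while for an extreme rate like 50% it only gives the right ballpark, which is worth keeping in mind when explaining where 72 comes from.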

Now, here is the problem: explain where this number comes from.

NOTE: This is an applied math problem, so you do not need to be formal. You can use graphs, use calculators, approximate, and do whatever you want, as long as you can nicely explain this phenomenon.