Hi, I can't figure out this problem and it's driving me nuts. A man gets 1 cent for the first day, and it doubles every day for 34 days... Next part: when would he become a millionaire?
So with $M = 2^n - 1$ cents (the running total after $n$ days), sub $n = 34$ and solve for $M$.
Then sub $M = 100{,}000{,}000$ (the number of cents in one million dollars) and find the first integer value of $n$ such that you've got at least a million dollars. Trial and error might be the best approach here. Hint: from the first part you know $n < 34$.
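The trial-and-error search can be sketched in a few lines of Python (my own sketch to check the hint, not part of the original thread):

```python
# Find the smallest day n on which the running total 2^n - 1 cents
# reaches one million dollars (100,000,000 cents).
TARGET_CENTS = 100_000_000

n = 1
while 2**n - 1 < TARGET_CENTS:
    n += 1

print(n)  # -> 27, comfortably less than 34 as the hint predicts
```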
A man gets 1 cent for the first day, and it doubles every day for 34 days.
(a) How much will he have at the end of the period?
(b) When would he become a millionaire?
He gets: $1,\ 2,\ 4,\ 8,\ \dots,\ 2^{n-1}$ cents.
This is a geometric series with first term 1 and common ratio 2.
The sum is: $S_n \;=\; \dfrac{2^n - 1}{2 - 1} \;=\; 2^n - 1$ cents.
(a) He will have a total of: $2^{34} - 1 \;=\; 17{,}179{,}869{,}183$ cents $\;=\; \$171{,}798{,}691.83$.

When does $2^n - 1 \;\ge\; 100{,}000{,}000$ cents?

We have: $2^n \;\ge\; 100{,}000{,}001$.

Take logs: $n\log 2 \;\ge\; \log(100{,}000{,}001) \quad\Rightarrow\quad n \;\ge\; 26.575\ldots$
(b) He becomes a millionaire on the 27th day, since $2^{27} - 1 = 134{,}217{,}727$ cents $\approx \$1{,}342{,}177$, while $2^{26} - 1 = 67{,}108{,}863$ cents is still short of a million dollars.
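Both parts can be double-checked numerically; here is a short Python sketch (the variable names are mine, not from the thread):

```python
import math

# Part (a): total after 34 days, as the geometric sum 1 + 2 + ... + 2^33
total_cents = 2**34 - 1
print(total_cents)        # -> 17179869183 cents
print(total_cents / 100)  # in dollars

# Part (b): solve 2^n >= 100,000,001 with logs, then round up
n = math.ceil(math.log(100_000_001, 2))
print(n)  # -> 27
```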