I am reading a book about algorithms and I'm stuck on one of the exercises, where you have to calculate the largest input size that algorithms with different running times can handle within various units of time.
For each function f(n) and time t, determine the largest size n of an input that can be solved in time t, assuming the algorithm takes f(n) microseconds.
The table then lists a set of run-time functions along with the time values of a second, a minute, an hour, a day, a month, a year, and a century. So basically, for each function and each unit of time, you have to find the largest input n such that the running time is less than or equal to that amount of time, if I understand correctly.
I was wondering if anyone has suggestions for a good formula, or a general approach, for working out the value for each function/time combination?
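In case it helps frame an answer: one approach I've seen is that, instead of inverting each function by hand (easy for n² via a square root, painful for n lg n or n!), you can binary search for the largest n with f(n) ≤ t, since every run-time function in this kind of table is monotonically increasing. Here's a rough sketch of that idea; the example functions at the bottom are just guesses, since I haven't listed the actual table here:

```python
import math

def max_input_size(f, budget_microseconds):
    """Largest n such that f(n) <= budget_microseconds.

    Works for any monotonically increasing f, so no
    closed-form inverse is needed for each function.
    """
    # Grow an upper bound by doubling until f(hi) exceeds the budget.
    hi = 1
    while f(hi) <= budget_microseconds:
        hi *= 2
    lo = hi // 2  # invariant: f(lo) <= budget < f(hi)
    # Standard binary search to tighten the bracket to one value.
    while lo + 1 < hi:
        mid = (lo + hi) // 2
        if f(mid) <= budget_microseconds:
            lo = mid
        else:
            hi = mid
    return lo

# Hypothetical examples -- the functions in the book's table may differ.
ONE_SECOND = 1_000_000  # one second, in microseconds

print(max_input_size(lambda n: n * n, ONE_SECOND))            # n^2  -> 1000
print(max_input_size(lambda n: n * math.log2(n), ONE_SECOND))  # n lg n
```

For the slower-growing functions (like lg n) over a century the numbers get astronomically large, so exact integer arithmetic (as above) matters more than floating point there.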
Any help/direction would be greatly appreciated! :D