
Algorithm Run Times
Hey!
I am reading a book about algorithms and I'm stuck on one of the exercises, where you have to calculate the maximum input size for algorithms with different run times, over different units of time.
For example:
For each function $\displaystyle f(n)$ and time $\displaystyle t$ , determine the largest size $\displaystyle n$ of an input that can be solved in time $\displaystyle t$, assuming the algorithm takes $\displaystyle f(n)$ microseconds.
Then the table shows the following runtime functions:
$\displaystyle \log {n}$
$\displaystyle \sqrt{n}$
$\displaystyle n$
$\displaystyle n \log {n}$
$\displaystyle n^2$
$\displaystyle n^3$
$\displaystyle 2^n$
$\displaystyle n!$
and the time values of a second, minute, hour, day, month, year and century. So basically you have to calculate the maximum input $\displaystyle n$ for each function so that the run time is less than or equal to the unit of time, if I understand correctly.
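Since each $\displaystyle f(n)$ is increasing, one generic approach is to search numerically for the largest $\displaystyle n$ whose run time fits the budget, rather than inverting every function by hand. A quick sketch (assuming the budget is expressed in microseconds, and using doubling plus binary search; the function names here are just my own):

```python
import math

# Time budgets in microseconds (the algorithm takes f(n) microseconds,
# and 1 second = 10**6 microseconds).
SECOND = 10**6
MINUTE = 60 * SECOND
HOUR   = 60 * MINUTE

def largest_input(f, budget):
    """Largest integer n with f(n) <= budget, assuming f is increasing."""
    if f(1) > budget:
        return 0
    # Double n until f(n) exceeds the budget...
    hi = 1
    while f(hi) <= budget:
        hi *= 2
    lo = hi // 2  # invariant: f(lo) <= budget < f(hi)
    # ...then binary-search the boundary.
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if f(mid) <= budget:
            lo = mid
        else:
            hi = mid
    return lo

print(largest_input(lambda n: n * n, SECOND))   # n^2 in 1 second -> 1000
print(largest_input(lambda n: n ** 3, SECOND))  # n^3 in 1 second -> 100
```

The same search works for $\displaystyle n \log n$, $\displaystyle 2^n$ and $\displaystyle n!$; only the very slow-growing functions like $\displaystyle \log n$ and $\displaystyle \sqrt{n}$ are easier to invert directly, since their maximum $\displaystyle n$ is astronomically large.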
I was wondering if anyone has any suggestions on how to come up with a good formula for working out the values for each function and unit of time?
Any help/direction would be greatly appreciated! :D
BIOS

Okay, I've had a crack at the first one, which is the maximum input for $\displaystyle \log n$ such that the run time is $\displaystyle \leq 1$ second.
I started by finding a value whose log equals 1.0:
$\displaystyle \log 10 = 1.0$
Then I looked for the value of $\displaystyle n$ that would give me 10.
Each single value of $\displaystyle n$ costs $\displaystyle 10^{-6}$ seconds (the algorithm takes microseconds).
I need to get $\displaystyle 10^1$
so I do the following:
$\displaystyle (10^7)(10^{-6})=10^1$
The log of this is $\displaystyle 1.0$
Thus the greatest value I can have for $\displaystyle n$, when the run time is $\displaystyle \log n$ and must not be greater than $\displaystyle 1.0$ second, is $\displaystyle 10^7$.
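Checking that arithmetic numerically (just a quick throwaway check, using the base-10 log):

```python
import math

# With n = 10**7 and each unit worth 10**-6 seconds,
# the product is 10**7 * 10**-6 = 10**(7-6) = 10**1 = 10,
# and log10 of that is 1.0.
exponent = 7 - 6
print(10**exponent)              # 10
print(math.log10(10**exponent))  # 1.0
```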
Is that right?