we have Algorithm A whose runtime is f(n) =
and Algorithm B whose runtime is g(n) =
Assume we want to write a library function that is given an input of size n, and then selects which of the two algorithms to use based on which gives the better runtime for that size. Formally determine the cutoff value of n at which the library should stop using one algorithm and start using the other. Show your work and justify your answer.
Please help
CB, thanks for your reply. I have a question. If I find the positive root of f(n) = g(n), which is 0.90228, and from my limit calculations of the growth rates of the two functions (and from solving them using the quantified definitions) I figured that f(n) is in Theta of g(n). That means both functions grow at the same rate. How can you then say that for any value of n above 0.90228 we should use B, and below it we should use A?
They grow asymptotically "at the same rate", or rather one is bounded above by a multiple of the other and below by a different multiple of it. Thus asymptotically we could have f(n) ≈ K·g(n), where K is a constant. If K is anything other than 1, then one of these is eventually always smaller than the other, and so is the runtime of the more efficient algorithm.
In this case A is 32 times slower than B asymptotically.
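To make the dispatch idea concrete, here is a small sketch. Since the thread's actual f(n) and g(n) are not reproduced above, the runtimes below are stand-ins I made up: two quadratics in the same Theta class, with A asymptotically 32 times slower than B (leading coefficients 64 vs 2), echoing the constant mentioned in this thread. The cutoff is found numerically by bisection on f(n) − g(n).

```python
# Stand-in runtimes -- the thread's actual f(n) and g(n) are not shown,
# so these are illustrative: same Theta class, A asymptotically 32x
# slower than B (64/2 = 32), but A cheaper for small n.
def f(n):
    """Hypothetical cost of Algorithm A."""
    return 64 * n * n + 10

def g(n):
    """Hypothetical cost of Algorithm B."""
    return 2 * n * n + 300

def crossover(f, g, lo=0.1, hi=100.0, iters=100):
    """Bisect for the positive root of f(n) - g(n) = 0, assuming the
    difference changes sign exactly once on [lo, hi]."""
    d_lo = f(lo) - g(lo)          # sign at the left end is preserved below
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if d_lo * (f(mid) - g(mid)) <= 0:
            hi = mid              # root lies in [lo, mid]
        else:
            lo = mid              # root lies in [mid, hi]; sign at lo unchanged
    return (lo + hi) / 2.0

def choose(n, cutoff):
    """The library's dispatch rule: for these stand-in runtimes,
    A is cheaper below the cutoff and B above it."""
    return "A" if n < cutoff else "B"

cutoff = crossover(f, g)  # for these stand-ins: sqrt(290/62), about 2.163
```

The same pattern applies to the thread's real functions: solve f(n) = g(n) (algebraically or numerically), take the positive root as the cutoff, and check on each side of it which function is smaller to decide which algorithm each side gets.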
CB