This is the problem:

Two algorithms take n^{2} days and 2^{n} seconds, respectively, to solve an instance of size n.

What is the size of the smallest instance on which the former algorithm outperforms the latter algorithm?

I was able to get the answer by evaluating both functions f(n) = 86400*n^{2} and g(n) = 2^{n} (86400 because there are that many seconds in a day) for values of n starting at 1 and incrementing. At n = 26, algorithm 1 becomes more efficient than algorithm 2. I can also arrive at the same answer by graphing both functions. But I'm sure there has to be another way. I've tried Googling and found information on comparing the growth rates of two functions using limits, but none of those techniques let me calculate the minimum n at which the slower-growing algorithm becomes more efficient than the faster-growing one. Thanks for any help anyone can give me!
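For reference, here is a quick Python sketch of the incremental check I described, expressing both running times in seconds and searching for the first crossover point (the function names f and g are just my own labels, not anything standard):

```python
# Brute-force search for the smallest n at which the n^2-days algorithm
# beats the 2^n-seconds algorithm. Both costs are expressed in seconds.
SECONDS_PER_DAY = 86400

def f(n):
    # Former algorithm: n^2 days, converted to seconds.
    return SECONDS_PER_DAY * n**2

def g(n):
    # Latter algorithm: 2^n seconds.
    return 2**n

n = 1
while f(n) >= g(n):
    n += 1
print(n)  # smallest n with f(n) < g(n)
```

This confirms the crossover at n = 26: f(25) = 54,000,000 ≥ 2^{25} = 33,554,432, while f(26) = 58,406,400 < 2^{26} = 67,108,864.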