If \(\displaystyle N = 5000\) and algorithm \(\displaystyle A\) takes 10 seconds to solve the problem, how much time (in seconds) would algorithm \(\displaystyle B\) take?

Is my solution correct: \(\displaystyle 10 \cdot \frac{5000\log_2{5000}}{5000^2}\) seconds?
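The ratio above can be checked numerically. A minimal sketch, assuming (as the formula implies) that algorithm \(A\) runs in time proportional to \(N^2\) and algorithm \(B\) in time proportional to \(N\log_2 N\), with running time scaling linearly in the operation count:

```python
import math

# Assumed complexities (not stated explicitly in the question, but
# implied by the formula): A is ~ N^2, B is ~ N * log2(N).
N = 5000
t_A = 10.0  # seconds, given for algorithm A

# If time scales with operation count, then t_B / t_A = (N log2 N) / N^2:
t_B = t_A * (N * math.log2(N)) / (N ** 2)
print(f"t_B = {t_B:.4f} seconds")
```

Under those assumptions the expression simplifies to \(10\log_2(5000)/5000\), which is roughly \(0.025\) seconds, so algorithm \(B\) would finish far faster than \(A\).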

- Thread starter Nforce