Suppose we have an algorithm $\displaystyle A$ with time complexity $\displaystyle N^2$ and an algorithm $\displaystyle B$ that solves the same problem with time complexity $\displaystyle N\log_2{N}$.

If $\displaystyle N = 5000$ and algorithm $\displaystyle A$ takes 10 seconds to solve the problem, how much time (in seconds) would algorithm $\displaystyle B$ take?

Is my solution correct: $\displaystyle 10\cdot\frac{5000\log_2{5000}}{5000^2}$?
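As a numerical sanity check, the ratio can be evaluated directly. This is a minimal sketch assuming both algorithms have the same constant factor (the function name `time_for_b` is my own, chosen for illustration):

```python
import math

def time_for_b(n=5000, time_a=10.0):
    # Assuming identical constant factors, scale A's running time by the
    # ratio of the two complexities: t_B = t_A * (N log2 N) / N^2.
    return time_a * (n * math.log2(n)) / (n ** 2)

print(time_for_b())  # about 0.0246 seconds
```

Since $\displaystyle \frac{N\log_2{N}}{N^2}$ simplifies to $\displaystyle \frac{\log_2{N}}{N}$, this is just $\displaystyle 10\cdot\frac{\log_2{5000}}{5000}\approx 0.0246$ seconds.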