Suppose that an algorithm uses bit operations to solve a problem of size . Suppose that your machine can perform one bit operation in seconds. How long does your algorithm take to solve a problem of the size given below?
Note: if your algorithm takes more than 60 seconds, answer in minutes; more than 60 minutes, answer in hours; more than 24 hours, answer in days; more than 365 days, answer in years; more than 100 years, answer in centuries.
Our class hasn't covered algorithm running time yet, so I really need help with this. I've copied the question word for word.
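The arithmetic the question asks for is just (number of bit operations) × (seconds per operation), then converting up through the unit ladder the note describes. Here is a minimal sketch; since the actual operation count and speed are missing from the quoted question, the values in the example call (`2**30` operations at `1e-9` seconds each) are purely hypothetical placeholders.

```python
def solve_time(bit_ops: float, secs_per_op: float) -> str:
    """Return the elapsed time expressed in the unit the problem asks for."""
    seconds = bit_ops * secs_per_op
    minutes = seconds / 60
    hours = minutes / 60
    days = hours / 24
    years = days / 365
    centuries = years / 100
    # Walk up the unit ladder exactly as the note specifies.
    if seconds <= 60:
        return f"{seconds:g} seconds"
    if minutes <= 60:
        return f"{minutes:g} minutes"
    if hours <= 24:
        return f"{hours:g} hours"
    if days <= 365:
        return f"{days:g} days"
    if years <= 100:
        return f"{years:g} years"
    return f"{centuries:g} centuries"

# Hypothetical inputs, NOT the ones from the assignment:
print(solve_time(2**30, 1e-9))  # about 1.07 seconds for these assumed values
```

Once you plug in the real operation count and machine speed from the assignment, the same conversion chain gives the answer in the required unit.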