Hello, I was practicing some discrete math problems in my old textbook (Rosen, 5th ed.) and came across a problem where I didn't understand the solution.

The question is: Say you have a permutation of the integers from 1 to n (assume n is very large, e.g. 10^100), and you have an algorithm that estimates whether the list is in order by randomly choosing two indices i and j with i < j and checking whether the ith element is less than the jth. Of course, the more comparisons it makes, the more accurate it will be (if it does 1000 comparisons, it's more likely to correctly deduce whether the list is in order than if it does 2). So basically, if it does k comparisons and finds that all of them were in the correct order, it returns true; but if any one of them was out of order, it returns false.
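In case it helps, here's a minimal Python sketch of the algorithm as I understand it (the function name `probably_sorted` is mine, not from the book):

```python
import random

def probably_sorted(lst, k):
    """Do k random pairwise checks; return True only if all pass.

    Can return a false positive (an unsorted list may pass all k
    checks), but never a false negative on a sorted list.
    """
    n = len(lst)
    for _ in range(k):
        # pick two distinct indices i < j uniformly at random
        i, j = sorted(random.sample(range(n), 2))
        if lst[i] > lst[j]:
            return False  # found a pair out of order
    return True
```

Note that if the list really is in order, every check passes and it always returns true, so as far as I can tell the interesting case is when the list is *not* in order.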

What is the probability that the algorithm is correct, given that the list is n integers long and the algorithm does k comparisons?