Programming problem: primality testing

The problem is: devise the most efficient algorithm to determine primality for all n < 1,000,000, and demonstrate this by comparing the times taken by different algorithms.

I implemented trial division, the Fermat test, the Euler test, the Euler-Jacobi test, and the Miller-Rabin test.
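For reference, here is a minimal sketch of the trial-division baseline (in Python; your implementation and language may differ). It skips even numbers and multiples of 3, testing only candidates of the form 6k ± 1 up to √n:

```python
def is_prime_trial(n):
    """Trial division: test divisors 2, 3, then 6k +/- 1 up to sqrt(n)."""
    if n < 2:
        return False
    if n < 4:  # 2 and 3 are prime
        return True
    if n % 2 == 0 or n % 3 == 0:
        return False
    i = 5
    while i * i <= n:
        if n % i == 0 or n % (i + 2) == 0:
            return False
        i += 6
    return True
```

For n < 1,000,000 the inner loop runs at most ~167 iterations (√1,000,000 / 6), so even this naive method is quite fast per call.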

I expected that trial division combined with Miller-Rabin would be the most efficient.

So I ran that combined program on all n < 100,000 and timed it: it took 20 seconds.

For Miller-Rabin alone with bases 2 and 3, the time is 50 seconds.

BUT the other methods each take less than 5 seconds.
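For comparison, here is a sketch of Miller-Rabin with fixed bases (again in Python; this is my reading of the setup, not necessarily your exact code). A known result is that the witness set {2, 3} is deterministic for all n < 1,373,653, which covers the whole range below 1,000,000:

```python
def miller_rabin(n, bases=(2, 3)):
    """Miller-Rabin test; bases (2, 3) are deterministic for n < 1,373,653."""
    if n < 2:
        return False
    for p in (2, 3):  # handle the bases themselves and their multiples
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, n)  # modular exponentiation, O(log n) multiplications
        if x == 1 or x == n - 1:
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False  # a is a witness that n is composite
    return True
```

Note that each call does a few modular exponentiations regardless of how small a factor n has, whereas trial division rejects most composites after one or two divisions; that asymmetry matters when timing over an entire range rather than on a single large number.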

So how can trial division + Miller-Rabin be the most efficient approach?