This is a trick question...
A jet pilot plans to cover a 1000-mile course at an average speed of 1000 mph.
For the first 800 miles, the speed is 800 mph.
At what rate must the remaining distance be covered?
He plans to fly 1000 miles at an average of 1000 miles per hour.
. . . So his flight will take exactly one hour . . . right?
He flies the first 800 miles at 800 miles per hour.
. . . This takes exactly one hour.
This leaves no time at all for him to fly the other 200 miles. To meet the 1000-mph average, he would have to cover the remaining distance instantaneously.
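The arithmetic above can be checked directly; this is a minimal sketch of the time budget, using the figures from the puzzle:

```python
# Check the time budget for the average-speed puzzle.
total_distance = 1000      # miles planned
target_average = 1000      # mph desired average
first_leg_distance = 800   # miles flown so far
first_leg_speed = 800      # mph on the first leg

total_time = total_distance / target_average           # hours allowed for the whole trip
first_leg_time = first_leg_distance / first_leg_speed  # hours already spent

time_remaining = total_time - first_leg_time
print(time_remaining)  # 0.0 hours left for the final 200 miles
```

Since the allotted hour is used up after 800 miles, the remaining 200 miles would require infinite speed, and no finite rate answers the question.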
[Maybe he can say, "Beam me over, Scotty" ?]