Suppose you drive from New York City to Chicago at an average speed of 65 mph; that is, the total distance divided by the total time is 65 mph. If you look down at your speedometer at some moment during the trip, would you expect it to show exactly 65 mph? No: at that particular instant you might be going faster or slower than that. The first number is the "average" speed; the second is the "instantaneous" speed.
(Strictly speaking, what your speedometer shows is the average over one turn of the wheels, but that is a heck of a lot closer to "instantaneous" than the average over the entire trip!)
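The distinction can be sketched numerically. Below, a made-up position function (my invention, not anything measured) gives the odometer reading during a hypothetical 12-hour trip whose average works out to exactly 65 mph; the "speedometer" reading is approximated by averaging over a tiny time interval, just as the speedometer averages over one turn of the wheels:

```python
import math

TRIP_HOURS = 12.0  # hypothetical trip duration

def position(t):
    """Hypothetical odometer reading (miles) t hours into the trip:
    a steady 65 mph plus one slow speed-up/slow-down cycle."""
    return 65.0 * t + 10.0 * math.sin(2 * math.pi * t / TRIP_HOURS)

# Average speed: total distance divided by total time.
avg = position(TRIP_HOURS) / TRIP_HOURS

def speed(t, dt=1e-6):
    """'Speedometer' speed: average speed over a tiny interval around t."""
    return (position(t + dt) - position(t - dt)) / (2 * dt)

print(f"average speed over the trip: {avg:.2f} mph")   # 65.00
print(f"speed at hour 2: {speed(2):.2f} mph")          # above 65
print(f"speed at hour 6: {speed(6):.2f} mph")          # below 65
```

The average over the whole trip is 65 mph, yet at almost every instant the speedometer reads something else; shrinking the interval `dt` further is exactly the limiting process that defines instantaneous speed.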