What is the difference between "Average Rate of Change" and "Instantaneous Rate of Change"?
The time we measure over.
The average rate of change is measured over a finite interval $\displaystyle \Delta t$ in which an event starts and finishes. This interval could be anything from 10 seconds to a year.
The instantaneous rate of change is the rate at a single moment, taken over an infinitesimally small amount of time. As $\displaystyle \Delta t \rightarrow 0$, the average rate over the interval approaches the instantaneous rate, and $\displaystyle \Delta t$ becomes the differential $\displaystyle dt$.
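A quick numerical sketch of that limit, using a made-up position function $s(t) = t^2$ (whose exact derivative is $s'(t) = 2t$): as the interval $\Delta t$ shrinks, the average rate over $[t, t + \Delta t]$ closes in on the instantaneous rate.

```python
def s(t):
    """Position at time t (hypothetical example: s(t) = t**2)."""
    return t ** 2

def average_rate(f, t, dt):
    """Average rate of change of f over the interval [t, t + dt]."""
    return (f(t + dt) - f(t)) / dt

t = 3.0
for dt in (1.0, 0.1, 0.001, 1e-6):
    print(f"dt = {dt:<8} average rate = {average_rate(s, t, dt)}")
# As dt shrinks, the average rate approaches the instantaneous rate 2*t = 6.
```

Each row of output is an "average" speed over a shorter and shorter trip; the limit of that sequence is the "instantaneous" speed, i.e. the derivative.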
Suppose you drive from New York City to Chicago at an average of 65 mph; that is, the total distance divided by the total time is 65 mph. If you look down at your speedometer at some moment during the trip, would it have to show exactly 65 mph? No: at that particular instant you might be going faster or slower than 65 mph. The first is the "average" speed, the second the "instantaneous" speed.
(Strictly speaking, what your speedometer shows is the average over one turn of the wheels, but that is a heck of a lot closer to "instantaneous" than the entire trip!)