How do you find the average acceleration of an object over a certain time interval if you are given the start and end times and the velocity at each of those times?
By definition, the mean value of a function $\displaystyle a(t)$ on an interval $\displaystyle t_{1}<t<t_{2}$ is...
$\displaystyle m[a(t)] =\frac {\int_{t_{1}}^{t_{2}} a(t)\cdot dt}{t_{2}-t_{1}}$
But an antiderivative (primitive) of the acceleration $\displaystyle a(t)$ is the velocity $\displaystyle v(t)$, so that...
$\displaystyle m[a(t)]= \frac{v(t_{2})-v(t_{1})}{t_{2}-t_{1}}$
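For example, with assumed sample values (not taken from the original question) $\displaystyle v(t_{1})=2\ \text{m/s}$ at $\displaystyle t_{1}=0\ \text{s}$ and $\displaystyle v(t_{2})=10\ \text{m/s}$ at $\displaystyle t_{2}=4\ \text{s}$, this would give...

$\displaystyle m[a(t)]=\frac{v(t_{2})-v(t_{1})}{t_{2}-t_{1}}=\frac{10-2}{4-0}=2\ \text{m/s}^{2}$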
Kind regards
$\displaystyle \chi$ $\displaystyle \sigma$
In some sense, yes... what matters is that the 'average value' of the acceleration over the interval $\displaystyle t_{1}<t<t_{2}$ depends only on the velocity at the beginning [$\displaystyle t_{1}$] and at the end [$\displaystyle t_{2}$], and on nothing else...
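As a small illustration with two assumed velocity profiles: both $\displaystyle v(t)=2+2t$ and $\displaystyle v(t)=2+\frac{t^{2}}{2}$ satisfy $\displaystyle v(0)=2\ \text{m/s}$ and $\displaystyle v(4)=10\ \text{m/s}$, so on $\displaystyle 0<t<4$ they have the same average acceleration...

$\displaystyle m[a(t)]=\frac{10-2}{4-0}=2\ \text{m/s}^{2}$

...even though the instantaneous acceleration is constant in the first case and grows linearly [$\displaystyle a(t)=t$] in the second.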
Kind regards
$\displaystyle \chi$ $\displaystyle \sigma$