I am looking to find out where I am going wrong on the following problem.
If a ball is thrown up into the air with an initial velocity of 40 ft/sec, its height in feet t seconds later is given by y=40t-16t^2.
a. Find the average velocity for the time period beginning at t=1 and lasting for:
1.) 0.5 sec
2.) 0.1 sec
3.) 0.05 sec
4.) 0.01 sec
I know that average velocity is equal to change in height over change in time, so this is how I set it up for the first case: average velocity = (y(1.5) - y(1))/(1.5 - 1) = (24 - 24)/0.5.
This gives me an answer of 0 ft/sec, which doesn't seem to make sense. Can somebody point out where I am going wrong? BTW I did the other calculations the same way and they come out to "better" numbers, but I still wonder whether I am making a fundamental error.
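Here is a quick Python sketch I put together to double-check my arithmetic (the function y and the loop are just my own scaffolding, not part of the problem statement):

```python
# Compute the average velocity (y(1+h) - y(1)) / h for each interval length h.

def y(t):
    """Height in feet t seconds after the throw."""
    return 40 * t - 16 * t ** 2

for h in (0.5, 0.1, 0.05, 0.01):
    avg_v = (y(1 + h) - y(1)) / h
    print(f"h = {h}: average velocity = {avg_v:.4f} ft/sec")
```

It prints 0.0000, 6.4000, 7.2000, and 7.8400 ft/sec, which match my hand calculations, so at least the 0 for the first interval isn't an arithmetic slip.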