Hello everyone,

I am looking to find out where I am going wrong on the following problem.

If a ball is thrown up into the air with an initial velocity of 40 ft/sec, its height in feet t seconds later is given by y = 40t - 16t^2.

a. Find the average velocity for the time period beginning at t = 1 and lasting for:

1.) 0.5 sec

2.) 0.1 sec

3.) 0.05 sec

4.) 0.01 sec

I know that average velocity is equal to change in height divided by change in time, so this is how I set it up for the first case:

[(40(1.5) - 16(1.5)^2) - (40(1) - 16(1)^2)] / (1.5 - 1)

This gives me an answer of 0 ft/sec, which doesn't seem to make sense. Can somebody point out where I am going wrong? BTW, I did the other calculations the same way and they come out to "better" numbers, but I still wonder if I am making a fundamental error.
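In case it helps, here is a quick Python sketch I put together to double-check my arithmetic (the function names are just my own, not from the problem):

```python
# Height of the ball in feet, t seconds after it is thrown.
def height(t):
    return 40 * t - 16 * t ** 2

# Average velocity over the interval [t0, t0 + dt]:
# change in height divided by change in time.
def avg_velocity(t0, dt):
    return (height(t0 + dt) - height(t0)) / dt

# The four intervals from the problem, all starting at t = 1.
for dt in (0.5, 0.1, 0.05, 0.01):
    print(f"dt = {dt}: {avg_velocity(1, dt)} ft/sec")
```

This computes the same difference quotient I wrote above for each of the four time intervals.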

Thanks,

juventinoalex