Originally Posted by **kcsteven**

I have a problem that I am stuck on. The problem states: If a ball is thrown straight up into the air with an initial velocity of 90 ft/s, its height in feet after t seconds is given by y = 90t - 16t^2. Find the average velocity for the time period beginning when t = 1 and lasting:

a.) 0.1 sec., i.e. over the time interval [1, 1.1]

b.) 0.01 sec.

c.) 0.001 sec.

Finally, based on the above results, guess the instantaneous velocity of the ball when t = 1.

I set it up as (90(1.1) - 16(1)^2)/0.1.

I did the example in the book, and this one is a little different because the ball is going up; in the other question the ball is dropped from a certain height. I did take t = 1, plug it into the equation, and got 99 - 16 = 83, then divided by .01, and this was not correct. I also tried using a form of the difference quotient, but was not very successful. Let me know if one of these ideas is the right way to go. It is possible I just made a simple mistake, repeated it over and over, and did not catch on until I asked for help, so if you could please give me a hint, I would appreciate it!
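For reference, the averages the problem asks for can be checked numerically. This is a minimal Python sketch, assuming the standard definition of average velocity over [1, 1 + h] as (y(1 + h) - y(1))/h, where y is the height function from the problem statement (the function names here are just illustrative):

```python
def y(t):
    # Height of the ball in feet after t seconds, from the problem: y = 90t - 16t^2
    return 90 * t - 16 * t ** 2

def avg_velocity(h):
    # Average velocity over the interval [1, 1 + h]:
    # change in height divided by elapsed time.
    return (y(1 + h) - y(1)) / h

# Print the average velocity for each interval length in parts a), b), c).
for h in (0.1, 0.01, 0.001):
    print(f"h = {h}: average velocity = {avg_velocity(h)} ft/s")
```

Watching these values as h shrinks suggests what the instantaneous velocity at t = 1 should be.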

Thank you,

Keith Stevens