At what rate is revenue rising or falling?
"For the past few months, OHaganBooks.com has seen its weekly sales increase. At the moment OHaganBooks.com is selling 1000 books per week and its sales are rising at a rate of 200 books per week. Also, it is now selling all its book for $20 each, but the price is dropping at a rate of $1 per week. I need to know at what rate OHaganBooks.com's revenue is rising or falling given these conditions. I would also like to see that company's revenue increase at a rate of $5,000 per week. At what rate would sales have to have been increasing to accomplish this?"
So those are the two questions I have to answer. I'm guessing they're pretty simple. I understand that R = pq, which is not hard to calculate for each week. I put the answers into a table showing the total revenue for each week and the difference in revenue from the previous week.
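In case it helps, here is roughly how I built that table. This is just a sketch of my own arithmetic, assuming sales start at 1000 books/week (rising 200 per week) and the price starts at $20 (dropping $1 per week):

```python
# Weekly revenue table: R = p * q, with q rising 200/week and p dropping $1/week.
revenues = [(20 - t) * (1000 + 200 * t) for t in range(6)]

for t, R in enumerate(revenues):
    change = R - revenues[t - 1] if t > 0 else 0
    print(f"week {t}: revenue ${R}, change from previous week ${change}")
```

The change column this produces starts at $2,800 for week 1 and shrinks each week after that, which is part of why I'm confused about what the "rate" at the current moment should be.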
However, I'm guessing there is some calculus technique I need to use here, and I'm not quite sure what it is. If anyone could take a look and let me know what they think, I would greatly appreciate it.
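For what it's worth, I also tried estimating the instantaneous rate numerically, treating revenue as a function of time in weeks. I'm not sure this is the technique the problem wants, so take it as my own sanity check, not the official method:

```python
# Revenue as a function of time t (in weeks), using the given rates:
# q(t) = 1000 + 200t books/week, p(t) = 20 - t dollars/book.
def R(t):
    return (20 - t) * (1000 + 200 * t)

# Central-difference estimate of dR/dt at the current moment (t = 0).
h = 1e-6
rate = (R(h) - R(-h)) / (2 * h)
print(rate)  # ≈ 3000 dollars per week
```

If that estimate of $3,000 per week is right, it's noticeably different from the week-to-week differences in my table, which is what makes me think a derivative is involved rather than simple subtraction.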