I have the following question and I am not sure how to solve it:
The table of data below was produced by students who did the leaking-faucet experiment. The measuring container they used held only 100 milliliters. If the students had continued their experiment, after how many seconds would the measuring container have overflowed?
Time (seconds):           10   20   30   40    50   60    70
Water loss (milliliters):  2    5   8.5  11.5  14   16.5  19.5
thank you in advance for your help.
Sep 6th 2005, 02:19 AM
I'd solve it this way.
The differences in leaked volume every 10 seconds are 3, 3.5, 3, 2.5, 2.5 mL over the intervals from 10 sec to 60 sec. Then from the 60th to the 70th second, the leak is 3 mL again.
I read that as a repetition every 60 seconds, because the leak returns to 3 mL in the 60-to-70-second interval.
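A quick sketch in Python (using only the table values above) confirms those interval differences:

```python
# Cumulative water loss (mL) at each 10-second mark, from the table.
times = [10, 20, 30, 40, 50, 60, 70]
loss = [2, 5, 8.5, 11.5, 14, 16.5, 19.5]

# Leak per 10-second interval = successive differences of the cumulative loss.
diffs = [b - a for a, b in zip(loss, loss[1:])]
print(diffs)  # 3, 3.5, 3, 2.5, 2.5, then 3 again in the 60-70 sec interval
```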
So, the total leak after 60 sec = 16.5 mL
Time per mL of leak = 60 sec / 16.5 mL = 3.63636... = (360/99) sec/mL.
Total volume of measuring container = 100 mL.
For the first 99 mL: (99 mL)*(360/99 sec/mL) = 360 sec.
That means after 360 sec, the total leak is 99 mL (six full 60-sec cycles of 16.5 mL each).
Now, for the last 1 mL before the measuring container overflows:
From the 60-to-70-second data, the leak is 3 mL in the first 10 sec of a cycle.
(10 sec / 3 mL)*(1 mL) ≈ 3.33 sec
Therefore, after (360 + 3.33) = 363.33 seconds, the 100-mL measuring container will overflow.
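The whole calculation can be checked with a short sketch, assuming (as in the solution above) a repeating cycle of 16.5 mL per 60 sec and a rate of 3 mL per 10 sec at the start of each cycle:

```python
CYCLE_SECONDS = 60
CYCLE_LEAK_ML = 16.5   # leak per full 60-second cycle (table value at t = 60)
CONTAINER_ML = 100

# Whole cycles before the container fills: 6 cycles * 16.5 mL = 99 mL in 360 s.
full_cycles = int(CONTAINER_ML // CYCLE_LEAK_ML)
leaked = full_cycles * CYCLE_LEAK_ML          # 99 mL
elapsed = full_cycles * CYCLE_SECONDS         # 360 s

# Last 1 mL at the start-of-cycle rate of 3 mL per 10 s.
remaining = CONTAINER_ML - leaked             # 1 mL
elapsed += remaining * (10 / 3)               # + about 3.33 s

print(round(elapsed, 2))  # 363.33
```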