Originally Posted by **rhobere**

> word problems. awesome

Water flows through a purifying machine at a rate of 50 gal/min. The machine removes 10% of the impurities from the water as it flows through. Suppose I have 1000 gallons of water containing 5 gallons of impurities. How long do I have to run it through the machine to reduce the amount of impurity to half a gallon?

So I tried setting this up in the form dx/dt + r_out*(x/V) = r_in*c_in, where r_in = r_out = 50 gal/min, V = 1000 gal, and my initial condition is x(0) = 5 gal (with x being the amount of impurity in the tank). The problem is, I don't know what to use for c_in (the concentration of the inflow).

I'm pretty sure the problem wants me to use the integrating factor method, and I'm comfortable doing that; I just can't figure out how to set up the problem in its entirety. Would anybody like to help me figure out how to find c_in?
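For what it's worth, here's a quick script I used to sanity-check one candidate interpretation. It **assumes** the machine's outflow is piped straight back into the same tank, so the inflow is just the tank's own outflow with 10% of the impurities stripped out, i.e. c_in = 0.9*(x/V). That recirculation assumption isn't stated outright in the problem, so treat this as a sketch, not the answer:

```python
import math

# ASSUMPTION: recirculating setup, so the inflow concentration is the
# machine's output, c_in = 0.9 * (x / V)  (10% of impurities removed
# per pass). Then dx/dt = r*c_in - r*(x/V) = -(0.1*r/V)*x.

V = 1000.0      # tank volume, gal
r = 50.0        # flow rate in = out, gal/min
x = 5.0         # initial amount of impurity, gal
target = 0.5    # stop when impurity drops to half a gallon

dt = 0.01       # Euler time step, min
t = 0.0
while x > target:
    c_in = 0.9 * x / V              # assumed inflow concentration
    dxdt = r * c_in - r * (x / V)   # = -0.005 * x under this assumption
    x += dxdt * dt
    t += dt

print(f"numerical: t ≈ {t:.1f} min")
# Under the same assumption the exact solution is x(t) = 5*exp(-0.005*t),
# so x = 0.5 at t = ln(10)/0.005 = 200*ln(10):
print(f"analytic:  t = 200*ln(10) ≈ {200 * math.log(10):.1f} min")
```

The Euler result lands within a minute of the closed-form 200·ln(10) ≈ 460.5 min, which at least confirms that this choice of c_in gives a self-consistent first-order linear ODE.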