The problem is stated as follows:

A home buyer can afford to spend no more than $800/month on mortgage payments. Suppose that the interest rate is 9% and that the term of the mortgage is 20 years. Assume that interest is compounded continuously and that payments are also made continuously. Determine the maximum amount that this buyer can afford to borrow and determine the total interest paid during the term of the mortgage.

This is what I had in mind:

$$\frac{dP}{dt} = 800 - \frac{9}{100}(Q - P)$$

Here $P(t)$ is the total amount of the loan/mortgage paid off so far, and $Q$ is the amount of the mortgage itself, so $P(0) = 0$.
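To get there, I rewrote everything in terms of the outstanding balance $B(t) = Q - P(t)$ (I may well have slipped somewhere in the algebra):

$$\frac{dB}{dt} = \frac{9}{100}\,B - 800, \qquad B(0) = Q,$$

which is a first-order linear equation with solution

$$B(t) = \left(Q - \frac{80000}{9}\right)e^{\frac{9t}{100}} + \frac{80000}{9}.$$

Substituting back via $P(t) = Q - B(t)$, I eventually arrive at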

$$P(t) = \left(Q - \frac{80000}{9}\right)\left(1 - e^{\frac{9t}{100}}\right).$$
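As a sanity check, a short SymPy snippet like the one below (this only tests that the formula satisfies the ODE exactly as I wrote it, plus the initial condition, not that the model itself is set up correctly) simplifies the residual to zero:

```python
# Check that the proposed P(t) satisfies dP/dt = 800 - (9/100)(Q - P) with P(0) = 0.
# This verifies the algebra only, not whether the ODE models the mortgage correctly.
import sympy as sp

t, Q = sp.symbols('t Q', positive=True)
r = sp.Rational(9, 100)
P = (Q - sp.Rational(80000, 9)) * (1 - sp.exp(r * t))  # proposed solution

residual = sp.simplify(sp.diff(P, t) - (800 - r * (Q - P)))
print(residual)       # 0  -> P(t) solves the ODE as written
print(P.subs(t, 0))   # 0  -> the initial condition P(0) = 0 holds
```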

I am not sure whether this is correct, and if it is, how would I proceed? Thanks!