Originally Posted by **VonNemo19**

Hi. I'm doing my homework problem set and I came across this one:

A small metal bar, whose initial temperature is $\displaystyle 20^\circ{C}$, is dropped into a large container of boiling water. How long will it take the bar to reach $\displaystyle 90^\circ{C}$ if it is known that its temperature increases $\displaystyle 2^\circ{C}$ in $\displaystyle 1$ second? How long will it take the bar to reach $\displaystyle 98^\circ{C}$?

So, I'm sure that we use Newton's law of cooling, which states that the rate at which the temperature of an object increases or decreases is proportional to the difference between the object's temperature and that of the surrounding medium, or

$\displaystyle \frac{dT}{dt}=k(T-T_m)\text{ where }k<0$.

So, from the problem, I can infer that (1) $\displaystyle T(0)=20$; (2) $\displaystyle T(1)=22$; and (3) $\displaystyle T_m=100$ (because this is the boiling point of water).

Newton's differential equation is separable, so...

$\displaystyle \int\frac{dT}{T-100}=k\int{dt}$ implies that $\displaystyle T(t)=ce^{kt}+100$
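Spelling out the intermediate steps of the integration (the sign of $\displaystyle T-100$ gets absorbed into the constant $\displaystyle c$):

```latex
\int \frac{dT}{T-100} = k\int dt
\;\Longrightarrow\; \ln|T-100| = kt + C_1
\;\Longrightarrow\; |T-100| = e^{C_1}e^{kt}
\;\Longrightarrow\; T(t) = ce^{kt} + 100, \qquad c = \pm e^{C_1}.
```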

Now, $\displaystyle T(0)=20\Rightarrow{ce}^{k(0)}+100=20\Rightarrow{c} =-80$

To find $\displaystyle k$ we use the fact that $\displaystyle T(1)=22$. So, $\displaystyle {-80e}^{k(1)}+100=22$.
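Carrying that last equation one step further gives $\displaystyle e^{k}=\tfrac{78}{80}$, so $\displaystyle k=\ln(0.975)<0$, consistent with the sign condition. A quick numerical sketch (assuming the setup above — the helper name `T` is just for illustration) checks the constants and the time to reach $\displaystyle 90^\circ{C}$:

```python
import math

# From T(1) = 22:  -80*e^k + 100 = 22  =>  e^k = 78/80  =>  k = ln(0.975)
k = math.log(78 / 80)   # k is negative, as the model requires

def T(t):
    """Temperature of the bar (deg C) at time t (seconds)."""
    return -80 * math.exp(k * t) + 100

# Sanity checks against the given data
print(T(0))   # 20.0
print(T(1))   # 22.0 (up to floating-point rounding)

# Solving T(t) = 90:  e^{kt} = 10/80  =>  t = ln(1/8)/k
t90 = math.log(1 / 8) / k
print(t90)    # roughly 82.1 seconds
```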

All I want to know is if I've done anything wrong so far. So?