Wordy question about rate of temperature increase.

A tank contains water which is heated by an electric water heater working under the action of a thermostat. The temperature of the water, $\displaystyle \theta\,^\circ\mathrm{C}$, may be modelled as follows. When the water heater is first switched on, $\displaystyle \theta=40$. The heater causes the temperature to increase at a rate of $\displaystyle K_1\,^\circ\mathrm{C}$ per second, where $\displaystyle K_1$ is a constant, until $\displaystyle \theta=60$. The heater then switches off.

i) Write down, in terms of $\displaystyle K_1$, how long it takes for the temperature to increase from $\displaystyle 40\,^\circ\mathrm{C}$ to $\displaystyle 60\,^\circ\mathrm{C}$.
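My attempt at (i), assuming the rate stays constant at $\displaystyle K_1$ for the whole heating phase (and calling the heating time $\displaystyle t_1$, which is just my own label):

$\displaystyle t_1=\frac{60-40}{K_1}=\frac{20}{K_1}$ seconds.

Is that all there is to it?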

The temperature of the water then immediately starts to decrease at a variable rate of $\displaystyle K_2(\theta-20)\,^\circ\mathrm{C}$ per second, where $\displaystyle K_2$ is a constant, until $\displaystyle \theta=40$.

ii) Write down a differential equation to represent the situation as the temperature is decreasing.
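For (ii) my guess is the following, taking $\displaystyle t$ as the time in seconds and assuming $\displaystyle K_2>0$, so the minus sign is what makes $\displaystyle \theta$ decrease, but I am not sure about the sign:

$\displaystyle \frac{d\theta}{dt}=-K_2(\theta-20)$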

iii) Find the total length of time for the temperature to increase from $\displaystyle 40\,^\circ\mathrm{C}$ to $\displaystyle 60\,^\circ\mathrm{C}$ and then decrease to $\displaystyle 40\,^\circ\mathrm{C}$. Give your answer in terms of $\displaystyle K_1$ and $\displaystyle K_2$.
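For (iii) I tried separating the variables in my equation from (ii), restarting the clock at $\displaystyle t=0$ when the heater switches off at $\displaystyle \theta=60$ (here $\displaystyle t_2$ and $\displaystyle C$ are just my own labels), but I do not know whether this is right:

$\displaystyle \int\frac{d\theta}{\theta-20}=-\int K_2\,dt \;\Rightarrow\; \ln(\theta-20)=-K_2t+C$

With $\displaystyle \theta=60$ at $\displaystyle t=0$ this gives $\displaystyle \theta-20=40e^{-K_2t}$, so $\displaystyle \theta=40$ when $\displaystyle e^{-K_2t}=\tfrac{1}{2}$, i.e. $\displaystyle t_2=\frac{\ln 2}{K_2}$ seconds. Adding the time from (i), the total would be $\displaystyle \frac{20}{K_1}+\frac{\ln 2}{K_2}$ seconds.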

This long question has confused me, and I am not sure about the attempts I have written above. Please show me how to do it.

Thanks