Wordy question about the rate of temperature increase.

A tank contains water which is heated by an electric water heater working under the action of a thermostat. The temperature of the water, , may be modelled as follows. When the water heater is first switched on, . The heater causes the temperature to increase at a rate per second, where is a constant, until . The heater then switches off.

i) Write down, in terms of , how long it takes for the temperature to increase from to .
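The symbols have dropped out of the post, so as placeholders call the temperature θ, the constant heating rate k (degrees per second), and the start and switch-off temperatures θ₁ and θ₂; substitute whatever names and values your question actually uses. With a constant rate, the time is just the temperature change divided by the rate:

```latex
\frac{d\theta}{dt} = k
\quad\Longrightarrow\quad
t_{\text{heat}} = \frac{\theta_2 - \theta_1}{k}
```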

The temperature of the water then immediately starts to decrease at a variable rate per second, where is a constant, until .

ii) Write down a differential equation to represent the situation as the temperature is decreasing.
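The exact rate expression was also lost from the post. A common form in this type of question (assuming the rate of decrease is proportional to the excess of θ over some ambient temperature θ_a, with constant λ) would be:

```latex
\frac{d\theta}{dt} = -\lambda\,(\theta - \theta_a), \qquad \theta > \theta_a
```

The minus sign records that θ is decreasing; check the original question for the exact expression in the bracket.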

iii) Find the total length of time for the temperature to increase from to and then decrease to 40 °C. Give your answer in terms of and .
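A sketch of part iii, under the same assumed placeholder names (start temperature θ₁, switch-off temperature θ₂, cooling law dθ/dt = -λ(θ - θ_a)): separate the variables, integrate from θ₂ down to 40, and add the constant-rate heating time from part i.

```latex
\int_{\theta_2}^{40} \frac{d\theta}{\theta - \theta_a}
= -\lambda \int_0^{t_{\text{cool}}} dt
\quad\Longrightarrow\quad
t_{\text{cool}} = \frac{1}{\lambda}\,
\ln\!\frac{\theta_2 - \theta_a}{40 - \theta_a},
\qquad
t_{\text{total}} = \frac{\theta_2 - \theta_1}{k} + t_{\text{cool}}
```

If your question's cooling law differs (e.g. rate proportional to θ itself), the same separation-of-variables steps apply with the appropriate bracket.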

This long question has confused me. Please show me how to do it.

Thanks