I am having trouble getting my head around this question. Please tell me if I am on the right track — and I have no idea how to go about part (b).

The voltage across a resistor in a circuit is given by V = IR.

If R = 0.010t^2 (ohms) and I = 4.12 + 0.020t (amps), where t = time in seconds:

a) Find the rate of change of the voltage with respect to time when t = 2.5 s.

V = (4.12 + 0.020t)(0.010t^2)

Expanded: V = 0.0412t^2 + 0.0002t^3

dV/dt = 0.0824t + 0.0006t^2

At t = 2.5: dV/dt = 0.0824(2.5) + 0.0006(2.5)^2

= 0.20975 volts/second (about 210 mV/s)
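To double-check my hand calculation for part (a), I tried a quick numerical sanity check: approximate dV/dt at t = 2.5 s with a central difference on the original product V = IR and see if it matches my 0.20975 V/s. (Just my own check, not part of the assignment.)

```python
def V(t):
    """Voltage V = I * R, with I = 4.12 + 0.020t amps and R = 0.010t^2 ohms."""
    return (4.12 + 0.020 * t) * (0.010 * t ** 2)

h = 1e-6          # small step for the central difference
t = 2.5
dVdt = (V(t + h) - V(t - h)) / (2 * h)
print(dVdt)       # comes out essentially 0.20975, matching the hand result
```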

b) At what time(s) will the voltage be a minimum?

This is where I'm stuck. How do I turn this into an equation of a line? Do I need to?
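My only idea so far: I think a minimum should happen where dV/dt = 0, so as scratch work I scanned V(t) on a grid of t >= 0 to see where the smallest value actually lands (not a proof, and I'm not sure it's the intended approach):

```python
def V(t):
    # same V = I * R as in part (a)
    return (4.12 + 0.020 * t) * (0.010 * t ** 2)

# scan t from 0 to 10 s in steps of 0.01 s and pick the t with smallest V
ts = [i * 0.01 for i in range(0, 1001)]
t_min = min(ts, key=V)
print(t_min, V(t_min))
```

The scan points at t = 0, which makes sense since both terms of V = 0.0412t^2 + 0.0002t^3 are non-negative for t >= 0 — but I'd still like to know how to show this properly with the derivative.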