Calculating rate of voltage change through a circuit.

Here's the question:

Voltage across a circuit is given by Ohm's law, $V = IR$, where $I$ is the current and $R$ is the resistance.

If we place two resistors with resistances $R_1$ and $R_2$ in parallel, their combined resistance $R$ is given as
$$R = \frac{R_1 R_2}{R_1 + R_2}.$$
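As a quick sanity check of the parallel-resistance formula, here is a small Python sketch using the resistances from the problem ($R_1 = 3$, $R_2 = 5$); the function name is just mine:

```python
def parallel_resistance(r1, r2):
    """Combined resistance of two resistors in parallel: R = r1*r2 / (r1 + r2)."""
    return r1 * r2 / (r1 + r2)

print(parallel_resistance(3, 5))  # 1.875 ohms, i.e. 15/8
```

This agrees with the equivalent form $1/R = 1/R_1 + 1/R_2$.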

Suppose the current is $I = 2$ amps and increasing at a given rate $\frac{dI}{dt}$ amp/sec, $R_1$ is $3$ ohms and increasing at $0.5$ ohm/sec, while $R_2$ is $5$ ohms and decreasing at $0.1$ ohm/sec.

Calculate the rate at which the voltage is changing.

This is what I've got:

Need to find $\frac{dV}{dt}$.

Using the Chain Rule, I get this:

$$\frac{dV}{dt} = \frac{\partial V}{\partial I}\frac{dI}{dt} + \frac{\partial V}{\partial R}\frac{dR}{dt},$$

where

$$\frac{\partial V}{\partial I} = R$$

and

$$\frac{\partial V}{\partial R} = I$$

and

$$R = \frac{R_1 R_2}{R_1 + R_2} = \frac{3 \cdot 5}{3 + 5} = \frac{15}{8}$$

and

$$I = 2.$$

Which all comes to:

$$\frac{dV}{dt} = \frac{15}{8}\,\frac{dI}{dt} + 2\,\frac{dR}{dt}.$$
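The product-rule step can be checked numerically in Python. This is only a sketch: the rates for $R_1$ and $R_2$ are the ones given in the problem, but the current's rate is missing from the question, so the value `a = 0.2` below is a hypothetical stand-in for $\frac{dI}{dt}$, used purely to test the identity $\frac{dV}{dt} = R\,\frac{dI}{dt} + I\,\frac{dR}{dt}$ at $t = 0$:

```python
a = 0.2                                   # hypothetical dI/dt (not given in the question)

def I(t):  return 2 + a * t               # current, starting at 2 amps
def R1(t): return 3 + 0.5 * t             # first resistance: 3 ohms, +0.5 ohm/sec
def R2(t): return 5 - 0.1 * t             # second resistance: 5 ohms, -0.1 ohm/sec
def R(t):  return R1(t) * R2(t) / (R1(t) + R2(t))  # parallel combination
def V(t):  return I(t) * R(t)             # Ohm's law

h = 1e-6                                  # step for central differences
dVdt_numeric = (V(h) - V(-h)) / (2 * h)   # direct numerical dV/dt
dRdt_numeric = (R(h) - R(-h)) / (2 * h)   # numerical dR/dt
dVdt_formula = R(0) * a + I(0) * dRdt_numeric  # product-rule expansion

print(dVdt_numeric, dVdt_formula)         # the two values agree
```

Whatever value is used for `a`, the two printed numbers match, which confirms the expansion above is set up correctly.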

and that's as far as I've got! How do I find the rate of change of the resistance, $\frac{dR}{dt}$? I've tried a few ways, but none of them seem satisfactory. Any ideas?