Thank you in advance for your help. I'm having trouble with the setup of this problem. The problem states:
The loop current I in a series RL circuit with constant voltage E0 satisfies LI' + RI = E0 (where I' is the first derivative of I) by Kirchhoff's voltage law. Assume that R and L are constants, and assume that initially there is no current in the circuit.
a) Find the current as a function of time.
b) Find the steady state solution.
c) When will the current be (1-e^-1) times the steady state current?
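As a sanity check on whatever closed-form answer comes out of parts (a)-(c), one can integrate the ODE numerically and watch the behavior directly. This is only a rough sketch: the values L, R, and E0 below are arbitrary examples (not given in the problem), and it uses a simple forward Euler step.

```python
import math

# Example values (assumptions, not from the problem statement):
L, R, E0 = 2.0, 4.0, 12.0

dt = 1e-5          # Euler step size
t, I = 0.0, 0.0    # initial condition: no current at t = 0
t_hit = None       # first time I reaches (1 - e^-1) of the steady state

target = (1 - math.e**-1) * (E0 / R)   # E0/R is the steady-state current
while t < 10 * L / R:                  # integrate well past the time constant
    if t_hit is None and I >= target:
        t_hit = t
    I += dt * (E0 - R * I) / L         # Euler step of I' = (E0 - R*I)/L
    t += dt

print(I)      # should settle near the steady state E0/R
print(t_hit)  # should land near the time constant L/R
```

With these example numbers the printed current approaches E0/R = 3.0, and the crossing time comes out near L/R = 0.5, which is a useful check on part (c).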