Please help!
Given:
y'(t) + a(t)y(t) = f(t), with "a" and "f" continuous on R
a(t) >= c > 0
lim (t->oo) f(t) = 0
Demonstrate:
Every solution y(t) satisfies lim (t->oo) y(t) = 0
Thank You.
Here is a hint to get you started:
The integrating factor for this equation is
$\displaystyle I(t)=e^{\int_{b}^{t}a(x)dx} $
Multiplying by this and writing the left hand side as a derivative gives
$\displaystyle \frac{d}{dt}\left(y(t)\cdot e^{\int_{b}^{t}a(x)dx}\right)=e^{\int_{b}^{t}a(x)d x}f(t)$
Now use some of the assumptions to bound the solution to the equation.
How did you get that?
If you integrate what I gave you above from $b$ to $t$ you should get
$\displaystyle y(t)\cdot e^{\int_{b}^{t}a(x)dx}-y(b)=\int_{b}^{t}e^{\int_{b}^{s}a(x)dx}f(s)ds$
(I write the dummy variable as $s$ so it doesn't clash with the unknown $y$, and keep the lower limit $b$ so it doesn't clash with the constant $c$ from the hypothesis $a(t)\ge c>0$). Solving for $\displaystyle y(t)$ gives
$\displaystyle y(t)=e^{-\int_{b}^{t}a(x)dx}\left(y(b)+\int_{b}^{t}e^{\int_{b}^{s}a(x)dx}f(s)ds\right)$
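This is a proof exercise, but a quick numerical check can build confidence in the claim before you chase the bound. Here is a minimal sketch (not a proof) with illustrative choices I am assuming: $a(t)=2+\sin t\ge 1>0$, $f(t)=1/(1+t)\to 0$, a large initial value $y(0)=5$, and a small explicit Euler step.

```python
# Numerical illustration (not a proof) of: y' + a(t) y = f(t),
# a(t) >= c > 0 and f(t) -> 0 force y(t) -> 0.
# The choices of a, f, y0, t_end, and dt below are assumptions for the demo.
import math

def solve(y0=5.0, t_end=60.0, dt=1e-3):
    a = lambda t: 2.0 + math.sin(t)   # a(t) >= c = 1 > 0
    f = lambda t: 1.0 / (1.0 + t)     # f(t) -> 0 as t -> oo
    y, t = y0, 0.0
    for _ in range(int(t_end / dt)):
        # explicit Euler step for y' = f(t) - a(t) * y
        y += dt * (f(t) - a(t) * y)
        t += dt
    return y

print(abs(solve()))  # small at t = 60 despite y(0) = 5
```

By $t=60$ the transient $y(0)e^{-\int_0^t a}$ is negligible and $y(t)$ tracks roughly $f(t)/a(t)$, which is small and shrinking; that is exactly the behavior the bound in the exercise should capture.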