Please help! Given: $y'(t) + a(t)\,y(t) = f(t)$, with $a$ and $f$ continuous on $\mathbb{R}$, $a(t) \ge c > 0$, and $\lim_{t\to\infty} f(t) = 0$. Demonstrate: any solution $y(t)$ satisfies $\lim_{t\to\infty} y(t) = 0$. Thank you.
Originally Posted by Pedro: Please help! Given $y'(t) + a(t)\,y(t) = f(t)$, with $a$ and $f$ continuous on $\mathbb{R}$, $a(t) \ge c > 0$, and $\lim_{t\to\infty} f(t) = 0$, demonstrate that any solution $y(t)$ satisfies $\lim_{t\to\infty} y(t) = 0$.

Here is a hint to get you started: the integrating factor for this equation is $\mu(t) = e^{\int_0^t a(s)\,ds}$. Multiplying by this and writing the left-hand side as a derivative gives
\[
\frac{d}{dt}\Bigl( y(t)\, e^{\int_0^t a(s)\,ds} \Bigr) = f(t)\, e^{\int_0^t a(s)\,ds}.
\]
Now use some of the assumptions to bound the solution to the equation.
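As a quick numerical sanity check of the claim (an editorial sketch, not part of the thread: the sample coefficient $a(t) = 2 + \sin t \ge 1 = c$ and forcing $f(t) = e^{-t} \to 0$ are illustrative choices satisfying the hypotheses), a simple explicit Euler integration shows the solution decaying to zero:

```python
import math

def a(t):
    # Sample coefficient satisfying a(t) >= c = 1 > 0
    return 2.0 + math.sin(t)

def f(t):
    # Sample forcing with f(t) -> 0 as t -> infinity
    return math.exp(-t)

def solve(y0, t_end, h=1e-3):
    """Explicit Euler on y'(t) = f(t) - a(t) * y(t), starting from y(0) = y0."""
    t, y = 0.0, y0
    while t < t_end:
        y += h * (f(t) - a(t) * y)
        t += h
    return y

print(solve(5.0, 1.0))   # still of noticeable size at small t
print(solve(5.0, 20.0))  # essentially zero for large t
```

The step size keeps $h \cdot \max a(t)$ well inside Euler's stability region, so the computed decay reflects the equation and not the scheme.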
Originally Posted by TheEmptySet: Now use some of the assumptions to bound the solution to the equation.

I've tried doing that. Maybe I'm missing something, but I get $\lim_{t\to\infty} \frac{d}{dt}\bigl( y(t)\, e^{\int a(t)\,dt} \bigr) = 0$, which would give that $y(t)\, e^{\int a(t)\,dt} = C$ is constant. How can I reach $y(t) \to 0$ as $t \to \infty$?
Originally Posted by Pedro: I've tried doing that. Maybe I'm missing something, but I get $\lim_{t\to\infty} \frac{d}{dt}\bigl( y(t)\, e^{\int a(t)\,dt} \bigr) = 0$, which would give that $y(t)\, e^{\int a(t)\,dt} = C$ is constant. How can I reach $y(t) \to 0$ as $t \to \infty$?

How did you get that? If you integrate what I gave you above you should get
\[
y(t)\, e^{\int_0^t a(s)\,ds} = y(0) + \int_0^t f(s)\, e^{\int_0^s a(u)\,du}\,ds.
\]
Solving for $y(t)$ gives
\[
y(t) = y(0)\, e^{-\int_0^t a(s)\,ds} + e^{-\int_0^t a(s)\,ds} \int_0^t f(s)\, e^{\int_0^s a(u)\,du}\,ds.
\]
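One standard way to finish from here (an editorial sketch, not part of the original thread, assuming the usual solved form of the equation):

```latex
% Write A(t) := \int_0^t a(s)\,ds, so any solution has the form
\[
  y(t) = y(0)\,e^{-A(t)} + \frac{\int_0^t f(s)\,e^{A(s)}\,ds}{e^{A(t)}} .
\]
% Since a(t) \ge c > 0, we have A(t) \ge ct \to \infty, hence
\[
  \bigl| y(0)\,e^{-A(t)} \bigr| \le |y(0)|\,e^{-ct} \longrightarrow 0 .
\]
% For the quotient: if the numerator stays bounded, the quotient tends to 0
% because e^{A(t)} \to \infty. If the numerator tends to \pm\infty, apply
% L'Hopital's rule (\infty/\infty form):
\[
  \lim_{t\to\infty} \frac{f(t)\,e^{A(t)}}{a(t)\,e^{A(t)}}
  = \lim_{t\to\infty} \frac{f(t)}{a(t)},
  \qquad
  \Bigl| \frac{f(t)}{a(t)} \Bigr| \le \frac{|f(t)|}{c} \longrightarrow 0 .
\]
% Either way both terms vanish, so y(t) \to 0 as t \to \infty.
```

The key points are that $a(t) \ge c > 0$ forces the exponential weight $e^{A(t)}$ to blow up at least like $e^{ct}$, and $f(t) \to 0$ kills the L'Hôpital quotient.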