Hello,

I'm having a hard time with the following simple linear differential inequality, which comes from a control problem:

u'(t) \leq \alpha(t) - u(t)\,,\quad u(0) = u_0 > 0\,,

with a given smooth \alpha(t) satisfying

0 \leq \alpha(t) \leq u(t) for all t\geq 0.
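
For reference, multiplying the inequality by the integrating factor e^t and integrating (Grönwall) gives the standard bound

u(t) \leq u_0 e^{-t} + \int_0^t e^{-(t-s)} \alpha(s)\, ds\,,

though this alone does not obviously control u(t) - \alpha(t).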

My intuition is that \lim_{t\to\infty} \bigl(u(t) - \alpha(t)\bigr) = 0, and that the convergence is exponential, i.e., |u(t) - \alpha(t)| = u(t) - \alpha(t) \leq c_1 e^{-c_2 t} for some constants c_1, c_2 > 0.
For instance, if \alpha were a constant, then exponential convergence clearly holds.
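(For constant \alpha \equiv \bar\alpha, the computation I have in mind is: setting v(t) = u(t) - \bar\alpha, the inequality becomes

v'(t) \leq -v(t)\,,\quad v(0) = u_0 - \bar\alpha \geq 0\,,

and Grönwall yields 0 \leq u(t) - \bar\alpha \leq (u_0 - \bar\alpha)\, e^{-t}, i.e. one can take c_2 = 1 in this case.)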
Do you see a simple proof for a time-dependent \alpha (I'm going grey over this, but could not prove it), or is my intuition wrong?

Many thanks, Peter