# General Solution of inhomogeneous ODE

• Dec 29th 2009, 09:20 AM
matlabnoob
General Solution of inhomogeneous ODE
OK, so again, reading my lecture notes, I bumped into this problem...

consider
A = ( 1  2
      0 -1 )
find the general solution of
x' = Ax + (0, 1)

write x = (x, y); then the ODE is:
x' = x + 2y
y' = -y + 1

y' = -y + 1 implies y(t) = Ce^-t + 1 (how did one obtain this?)

The ODE for x(t) is x' = x + 2(Ce^-t + 1).
This linear inhomogeneous ODE gives x(t) = -2 - ce^-t + de^t. Again, how does one get this? I integrated but I can't get it. After trying for hours... I simply gave up. ='(

Thank you!
• Dec 29th 2009, 01:59 PM
abender
While the technique described in this post works, there is a big-time shortcut to this problem that I initially overlooked. The shortcut works when A is a constant matrix. Check the posts after this one for an explanation of this shortcut method.

The idea of variation of parameters is to seek the solution to
$\bf{x}' = A\bf{x} + \bf{b}$
in the form
$
\bf{v}_1(t)\bf{x}_1(t) + \bf{v}_2 (t)\bf{x}_2 (t) = F(t)\bf{v}(t)$

where $F(t) = \begin{bmatrix} x_1 & x_2 \end{bmatrix}$ is a fundamental matrix (its columns $x_1, x_2$ are linearly independent solutions of the homogeneous system).

Note that the constant vector c of coefficients is replaced by an unknown function v(t).

What is the equation for our unknown function v(t), you ask?
It is fairly straightforward to derive:

$
(F\bf{v})' = F'\bf{v} + F\bf{v}' = A(F\bf{v}) + \bf{b}$, and since $F' = AF$ (each column of $F$ solves the homogeneous system), this reduces to
$F\bf{v}' = \bf{b} \implies
\bf{v}' = F^{-1}\bf{b}$.

IMPORTANT: You must make sure that $F^{-1}$ exists (equivalently, that the columns of $F$ are linearly independent) in order to solve for v!

Finally, using integration, we have:

$
\bf{x}(t) = F(t) \int F(t)^{-1} \bf{b}(t)\,dt$

(the constant of integration here supplies the homogeneous part $F(t)\bf{c}$ of the general solution).
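To illustrate (a check of my own, not part of the original post), here is a short SymPy sketch that carries the formula out for the system in question. The fundamental matrix below is built from the homogeneous solutions $e^t(1,0)^T$ and $e^{-t}(1,-1)^T$, which I obtained from the eigenpairs of $A$; that computation is an assumption added here, not something stated in the thread:

```python
import sympy as sp

t = sp.symbols('t')

# Fundamental matrix for x' = Ax with A = [[1, 2], [0, -1]]:
# columns are the homogeneous solutions e^t*(1, 0) and e^(-t)*(1, -1)
# (from the eigenpairs of A -- my computation, not from the post).
F = sp.Matrix([[sp.exp(t),  sp.exp(-t)],
               [0,         -sp.exp(-t)]])
b = sp.Matrix([0, 1])

# v' = F^{-1} b, then integrate componentwise (constants of integration omitted,
# so this yields a particular solution)
v_prime = F.inv() * b
v = v_prime.applyfunc(lambda expr: sp.integrate(expr, t))

# particular solution x_p = F v
x_p = sp.simplify(F * v)
```

This produces the constant particular solution $(-2, 1)^T$, matching the $-2$ and $+1$ constants in the answer quoted in the question; adding $F(t)\bf{c}$ gives the general solution.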

I'll now rewrite the original problem in LaTeX so it is easier on the eye:

Find the general solution of
$
\bf{x}' = A\bf{x} + \begin{bmatrix} 0 \\ 1 \end{bmatrix}$
, where $A = \begin{bmatrix} 1 & 2 \\ 0 & -1 \end{bmatrix}$.

Does this help?
• Dec 29th 2009, 03:40 PM
Jester
Quote:

Originally Posted by abender
The idea of variation of parameters is to seek the solution to
$\bf{x}' = A\bf{x} + \bf{b}$
in the form
$
\bf{v}_1(t)\bf{x}_1(t) + \bf{v}_2 (t)\bf{x}_2 (t) = F(t)\bf{v}(t)$
whereby $F(t) = \begin{bmatrix} x_1 & x_2 \end{bmatrix}$ is a fundamental matrix.

Note that the constant vector c of coefficients is replaced by an unknown function v(t).

What is the equation for our unknown function v(t), you ask?
It is fairly straightforward to derive:
$
(F\bf{v})' = F'\bf{v} + F\bf{v'} = A(F\bf{v}) + \bf{b} \implies
F\bf{v'} = \bf{b} \implies
\bf{v'} = F^{-1}\bf{b}$
.
IMPORTANT: You must make sure that $F^{-1}$ exists in order to solve for v!

Finally, using integration, we have:
$
\bf{x}(t) = F(t) \int F(t)^{-1} \bf{b}(t)dt
$
I'll now rewrite the original problem in LaTeX so it is easier on the eye:

Find the general solution of
$
\bf{x}' = A\bf{x} + \begin{bmatrix} 0 \\ 1 \end{bmatrix}$
, whereby $A = \begin{bmatrix} 1 & 2 \\ 0 & -1 \end{bmatrix}$.
Does this help?

I believe this is overkill for this particular system.

Quote:

Originally Posted by matlabnoob
OK, so again, reading my lecture notes, I bumped into this problem...

consider
A = (1 2,
0 -1)

find the general solution of
x' = Ax + ( 0,1 )

write x = (x, y) then the ODE is :
x' = x+2y
y' = -y + 1

y' = -y + 1 implies y(t) = Ce^-t + 1 (how did one obtain this?) (*)

The ODE for x(t) is x' = x + 2(Ce^-t + 1) (**)
This linear inhomogeneous ODE gives x(t) = -2 - ce^-t + de^t. Again, how does one get this? I integrated but I can't get it. After trying for hours... I simply gave up. ='(

thank you !

First question (*) above: it is a separable ODE, so separate and integrate.
Second question (**): the ODE is linear with integrating factor $\mu = e^{-t}$.
• Dec 29th 2009, 03:52 PM
abender
Or perhaps I have an easier way of doing this.

We have a matrix differential equation of the form

$
\bf{x}'(t) = A\bf{x}(t) + \bf{b}(t)$

where $A$ is a constant matrix.

Since $A$ is a constant matrix, if we can calculate $
e^{tA}$
, then we can find the solution to the system. So, we use $e^{-tA}$ as an integrating factor and multiply through:

$
e^{-tA} \bf{x}'(t) = e^{-tA} A\bf{x}(t) + e^{-tA} \bf{b}(t)$

$\implies
e^{-tA} \bf{x}'(t) - e^{-tA} A\bf{x}(t) = e^{-tA} \bf{b}(t)$

$\implies
\frac{d}{dt} \left[ e^{-tA} \bf{x}(t) \right] = e^{-tA} \bf{b}(t)$

$
\implies e^{-tA} \bf{x}(t) = \int^t_0 e^{-\mu A} \bf{b}(\mu)\, d\mu + \bf{c}
\implies \bf{x}(t) = e^{tA}\bf{c} + e^{tA} \int^t_0 e^{-\mu A} \bf{b}(\mu)\, d\mu$

where $\bf{c}$ is an arbitrary constant vector; the term $e^{tA}\bf{c}$ is the homogeneous part of the general solution.
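As a numerical sanity check (my addition, not part of the thread), one can evaluate this formula for the given $A$ and $b$ and compare against the thread's componentwise answer. The closed form $e^{tA} = \begin{bmatrix} e^t & e^t - e^{-t} \\ 0 & e^{-t} \end{bmatrix}$ used below is my own computation, obtained by diagonalizing $A$ (eigenvalues $1$ and $-1$):

```python
import numpy as np

A = np.array([[1.0,  2.0],
              [0.0, -1.0]])
b = np.array([0.0, 1.0])

def expm_tA(t):
    """Closed form of e^{tA} for this particular A (eigenvalues 1 and -1);
    derived by diagonalization -- my computation, not from the thread."""
    et, emt = np.exp(t), np.exp(-t)
    return np.array([[et, et - emt],
                     [0.0, emt]])

# Evaluate x(t) = e^{tA} * integral_0^t e^{-mu A} b dmu  (the particular
# solution with x(0) = 0) with a trapezoid rule on a fine grid.
t_end = 1.0
mus = np.linspace(0.0, t_end, 2001)
vals = np.stack([expm_tA(-mu) @ b for mu in mus])   # integrand samples
h = mus[1] - mus[0]
integral = h * (0.5 * vals[0] + vals[1:-1].sum(axis=0) + 0.5 * vals[-1])
x_particular = expm_tA(t_end) @ integral

# The thread's general solution with c = -1, d = 1 (i.e. x(0) = 0) at t = 1:
expected = np.array([np.exp(1.0) + np.exp(-1.0) - 2.0, 1.0 - np.exp(-1.0)])
```

The two agree to within the quadrature error, confirming that the matrix-exponential formula reproduces the componentwise answer given elsewhere in the thread.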
• Dec 29th 2009, 05:10 PM
matlabnoob
Thanks!! I'm still trying to follow it through.

I'm not very good with things unless they're worked out directly from how they originally are, but I have to know the method anyway.
• Dec 29th 2009, 05:19 PM
abender
You're welcome and I appreciate your appreciation.

Which part of the problem are you having the most difficulty with? Is it the "setting things up" part? Crunching the integral? Just getting started?

-Andy
• Dec 30th 2009, 05:42 AM
matlabnoob
Quote:

Originally Posted by abender
You're welcome and I appreciate your appreciation.

Which part of the problem are you having the most difficulty with? Is it the "setting things up" part? Crunching the integral? Just getting started?

-Andy

The main things I'm having difficulty with (I still don't know! I spent the night trying to figure it out... maths takes me forever to solve) are these:

y' = -y + 1 implies y(t) = Ce^-t + 1

OK, so for this I integrated just the homogeneous equation first, which I took as dy/dt + y = 0, and I got Ce^-t from that! But how did the +1 get there too? I'm not sure how to solve this without making errors or forgetting the +1.

And when I tried to integrate
x' = x + 2(Ce^-t + 1)

I couldn't get
x(t) = -2 - ce^-t + de^t

I tried integrating by parts... substitution... it won't work.

• Dec 30th 2009, 06:48 AM
Jester
Quote:

Originally Posted by matlabnoob
The main things I'm having difficulty with (I still don't know! I spent the night trying to figure it out... maths takes me forever to solve) are these:

y' = -y + 1 implies y(t) = Ce^-t + 1

OK, so for this I integrated just the homogeneous equation first, which I took as dy/dt + y = 0, and I got Ce^-t from that! But how did the +1 get there too? I'm not sure how to solve this without making errors or forgetting the +1.

And when I tried to integrate
x' = x + 2(Ce^-t + 1)

I couldn't get
x(t) = -2 - ce^-t + de^t

I tried integrating by parts... substitution... it won't work.

First ODE $\frac{dy}{dt} = - (y-1)$ so $\frac{dy}{y-1} = - dt$ so $\ln | y-1 | = - t + \ln c$. Exponentiating gives $y = ce^{-t} + 1$.

Second ODE $\frac{dx}{dt} - x = 2\left(c e^{-t} + 1 \right)$. The integrating factor is

$\mu = e^{-t}$ so $e^{-t} \left( \frac{dx}{dt} - x\right) = e^{-t} 2\left(c e^{-t} + 1 \right)$.

Thus, $\frac{d}{dt} \left( x e^{-t} \right) = 2c e^{-2t} + 2 e^{-t}$ and integrating gives $x e^{-t} = -c e^{-2t} - 2 e^{-t} + d$. Isolating x gives $x = -c e^{-t} - 2 + de^{t}$
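For anyone who wants to double-check the algebra (a quick verification of my own, not part of the post above), SymPy confirms that this pair satisfies both scalar equations:

```python
import sympy as sp

t, c, d = sp.symbols('t c d')

# the solutions derived above
y = c * sp.exp(-t) + 1
x = -c * sp.exp(-t) - 2 + d * sp.exp(t)

# residuals of y' = -y + 1 and x' = x + 2y should simplify to zero
res_y = sp.simplify(sp.diff(y, t) - (-y + 1))
res_x = sp.simplify(sp.diff(x, t) - (x + 2 * y))
```

Both residuals simplify to zero for all values of the constants c and d, so this is indeed the general solution.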
• Dec 30th 2009, 07:16 AM
abender
Good morning.

The question asks us to find the general solution to

$
\boldsymbol{x}'(t) = A\boldsymbol{x}(t) + \boldsymbol{b}(t)
$

where

$A = \begin{bmatrix} 1 & 2 \\ 0 & -1 \end{bmatrix}$ , $\boldsymbol{b}(t) = \begin{bmatrix} 0 \\ 1 \end{bmatrix}$ .

Just in case it is a source of confusion for you, I'll clarify that \boldsymbol{x} can be (and in this post, is) expressed as follows:

$
\boldsymbol{x}(t) = \begin{bmatrix} x \\ y \end{bmatrix}
$
.

Now I will generalize much less and try to spell out each step and computation for you. It will take a bit longer, but perhaps you need to see things this way.

We begin by plugging in what we know into
$
\boldsymbol{x}'(t) = A\boldsymbol{x}(t) + \boldsymbol{b}(t)
$
.

We get

$\begin{bmatrix} x' \\ y' \end{bmatrix} =
\begin{bmatrix} 1 & 2 \\ 0 & -1 \end{bmatrix}
\begin{bmatrix} x \\ y \end{bmatrix} +
\begin{bmatrix} 0 \\ 1 \end{bmatrix} =
$
$
\begin{bmatrix} x + 2y \\ -y \end{bmatrix} +
\begin{bmatrix} 0 \\ 1 \end{bmatrix} =
$
$
\begin{bmatrix} x+2y \\ -y +1 \end{bmatrix}$
.

So we have

$\begin{bmatrix} x' \\ y' \end{bmatrix} =
\begin{bmatrix} x+2y \\ -y +1 \end{bmatrix}$
.
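The componentwise expansion above can be checked mechanically; here is a small SymPy snippet (my addition, not part of the original post) that performs the matrix-vector arithmetic with symbolic x and y:

```python
import sympy as sp

x, y = sp.symbols('x y')
A = sp.Matrix([[1, 2], [0, -1]])

# A*(x, y) + (0, 1); should expand to (x + 2y, -y + 1)
rhs = A * sp.Matrix([x, y]) + sp.Matrix([0, 1])
```

This reproduces exactly the right-hand side (x + 2y, -y + 1) written above.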

Now we multiply both sides by the integrating factor $e^{-tA}$ .

$
e^{-tA} \begin{bmatrix} x' \\ y' \end{bmatrix} =
e^{-tA} \begin{bmatrix} x+2y \\ -y +1 \end{bmatrix}
$

Now, let's explain $e^{-tA}$ , which is called a matrix exponential. I'll give a general definition:

Let M be an n x n matrix. $e^M$ is the matrix exponential of M , which is expressed by the power series

$e^{M} = \sum_{k=0}^\infty \frac{1}{k!} M^k$ .

Note that the exponential of M is well-defined, as the series always converges.

Are you following so far?