
Math Help - General Solution of inhomogeneous ODE

  1. #1
    Member
    Joined
    Nov 2009
    Posts
    75

    General Solution of inhomogeneous ODE

    OK, so again, while reading my lecture notes I bumped into this problem.

    Consider
    A = (1 2,
         0 -1)
    and find the general solution of
    x' = Ax + (0, 1).

    Write x = (x, y); then the ODE is:
    x' = x + 2y
    y' = -y + 1

    y' = -y + 1 implies y(t) = Ce^-t + 1 (how did one obtain this??)

    The ODE for x(t) is x' = x + 2(Ce^-t + 1).
    This linear inhomogeneous ODE gives x(t) = -2 - ce^-t + de^t. Again, how does one get this? I integrated but I can't get it; after trying for hours I simply gave up. ='(

    Thank you!

  2. #2
    Senior Member
    Joined
    Mar 2008
    From
    Pennsylvania, USA
    Posts
    269
    Thanks
    37
    While the technique described in this post works, there is a big-time shortcut to this problem that I initially overlooked. The shortcut works when A is a constant matrix. Check the posts after this one for an explanation of this shortcut method.


    The idea of variation of parameters is to seek the solution to
    \bf{x}' = A\bf{x} + \bf{b}
    in the form

    \bf{x}(t) = \bf{v}_1(t)\bf{x}_1(t) + \bf{v}_2(t)\bf{x}_2(t) = F(t)\bf{v}(t),

    whereby  F(t) = \begin{bmatrix} \bf{x}_1 & \bf{x}_2 \end{bmatrix} is a fundamental matrix, i.e. its columns \bf{x}_1, \bf{x}_2 are independent solutions of the homogeneous system.

    Note that the constant vector c of coefficients is replaced by an unknown function v(t).

    What is the equation for our unknown function v(t), you ask?
    It is fairly straightforward to derive:

    (F\bf{v})' = F'\bf{v} + F\bf{v}' = A(F\bf{v}) + \bf{b} \implies F\bf{v}' = \bf{b} \implies \bf{v}' = F^{-1}\bf{b},

    where the first terms cancel because F' = AF (each column of F solves the homogeneous system).

    IMPORTANT: You must make sure that  F^{-1} exists in order to solve for \bf{v}!

    Finally, using integration, we have:

    \bf{x}(t) = F(t) \int F(t)^{-1} \bf{b}(t)\,dt


    I'll now rewrite the original problem in LaTeX so it is easier on the eye:

    Find the general solution of
    \bf{x}' = A\bf{x} + \begin{bmatrix} 0 \\ 1 \end{bmatrix}, whereby  A = \begin{bmatrix} 1 & 2 \\ 0 & -1 \end{bmatrix}.
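
    To make the formula concrete, here is a minimal sympy sketch (my own addition, not part of the original post) that applies \bf{x}(t) = F(t)\int F(t)^{-1}\bf{b}(t)\,dt to this particular A; the fundamental matrix F below is built from the eigenpairs of A.

    ```python
    import sympy as sp

    t, c1, c2 = sp.symbols('t c1 c2')

    A = sp.Matrix([[1, 2], [0, -1]])
    b = sp.Matrix([0, 1])

    # Fundamental matrix: columns are independent solutions of x' = Ax,
    # taken from the eigenpairs (1, (1, 0)) and (-1, (1, -1)) of A.
    F = sp.Matrix([[sp.exp(t),  sp.exp(-t)],
                   [0,         -sp.exp(-t)]])

    # Particular solution from x_p = F * Integral(F^{-1} b, t)
    x_p = sp.simplify(F * (F.inv() * b).integrate(t))
    print(x_p)                                        # Matrix([[-2], [1]])

    # General solution: x = F(t) c + x_p for an arbitrary constant vector c
    print(sp.simplify(F * sp.Matrix([c1, c2]) + x_p))
    ```

    With c1 = d and c2 = -c this reproduces the answer quoted in the first post: x = de^t - ce^{-t} - 2, y = ce^{-t} + 1.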


    Does this help?
    Last edited by abender; December 29th 2009 at 05:13 PM.

  3. #3
    MHF Contributor
    Jester's Avatar
    Joined
    Dec 2008
    From
    Conway AR
    Posts
    2,375
    Thanks
    48
    Quote Originally Posted by abender View Post
    The idea of variation of parameters is to seek the solution to
    \bf{x}' = A\bf{x} + \bf{b}
    in the form
    \bf{x}(t) = \bf{v}_1(t)\bf{x}_1(t) + \bf{v}_2(t)\bf{x}_2(t) = F(t)\bf{v}(t),
    whereby  F(t) = \begin{bmatrix} \bf{x}_1 & \bf{x}_2 \end{bmatrix} is a fundamental matrix.

    Note that the constant vector c of coefficients is replaced by an unknown function v(t).

    What is the equation for our unknown function v(t), you ask?
    It is fairly straightforward to derive:

    (F\bf{v})' = F'\bf{v} + F\bf{v}' = A(F\bf{v}) + \bf{b} \implies F\bf{v}' = \bf{b} \implies \bf{v}' = F^{-1}\bf{b}.

    IMPORTANT: You must make sure that  F^{-1} exists in order to solve for \bf{v}!

    Finally, using integration, we have:

    \bf{x}(t) = F(t) \int F(t)^{-1} \bf{b}(t)\,dt

    I'll now rewrite the original problem in LaTeX so it is easier on the eye:

    Find the general solution of
    \bf{x}' = A\bf{x} + \begin{bmatrix} 0 \\ 1 \end{bmatrix}, whereby  A = \begin{bmatrix} 1 & 2 \\ 0 & -1 \end{bmatrix}.
    Does this help?
    I believe this is overkill for this particular system.

    Quote Originally Posted by matlabnoob View Post
    OK, so again, while reading my lecture notes I bumped into this problem.

    Consider
    A = (1 2,
         0 -1)

    and find the general solution of
    x' = Ax + (0, 1).

    Write x = (x, y); then the ODE is:
    x' = x + 2y
    y' = -y + 1

    y' = -y + 1 implies y(t) = Ce^-t + 1 (how did one obtain this??) (*)

    The ODE for x(t) is x' = x + 2(Ce^-t + 1) (**)
    This linear inhomogeneous ODE gives x(t) = -2 - ce^-t + de^t. Again, how does one get this? I integrated but I can't get it; after trying for hours I simply gave up. ='(

    Thank you!
    For the first question (*) above: it is a separable ODE, so separate and integrate.
    The second question (**) is linear with the integrating factor \mu = e^{-t}.
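
    For anyone who wants to check these two hints without grinding through the algebra, here is a small sympy sketch (my addition, not from the thread) that asks dsolve for both scalar ODEs:

    ```python
    import sympy as sp

    t, C = sp.symbols('t C')
    x = sp.Function('x')
    y = sp.Function('y')

    # (*)  y' = -y + 1 : separable; dsolve gives y(t) = C1*exp(-t) + 1
    print(sp.dsolve(sp.Eq(y(t).diff(t), -y(t) + 1), y(t)))

    # (**) x' = x + 2*(C*exp(-t) + 1) : linear, integrating factor exp(-t);
    #      dsolve gives x(t) = C1*exp(t) - C*exp(-t) - 2
    print(sp.dsolve(sp.Eq(x(t).diff(t), x(t) + 2*(C*sp.exp(-t) + 1)), x(t)))
    ```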

  4. #4
    Senior Member
    Joined
    Mar 2008
    From
    Pennsylvania, USA
    Posts
    269
    Thanks
    37
    Or perhaps I have an easier way of doing this.

    We have a matrix differential equation of the form

    \bf{x}'(t) = A\bf{x}(t) + \bf{b}(t)

    whereby  A is a constant matrix.

    Since  A is a constant matrix, if we can calculate  e^{tA}, then we can find the solution to the system. So, we make  e^{-tA} an integrating factor and multiply throughout:

    e^{-tA} \bf{x}'(t) = e^{-tA} A\bf{x}(t) + e^{-tA} \bf{b}(t)

    \implies e^{-tA} \bf{x}'(t) - e^{-tA} A\bf{x}(t) = e^{-tA} \bf{b}(t)

    \implies \frac{d}{dt} \big[ e^{-tA} \bf{x}(t) \big] = e^{-tA} \bf{b}(t)

    \implies \bf{x}(t) = e^{tA} \Big[ \bf{x}(0) + \int_0^t e^{-\mu A} \bf{b}(\mu)\, d\mu \Big],

    where the \bf{x}(0) term carries the arbitrary constants of the general solution.
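
    Just to make the shortcut concrete, here is a sympy sketch (my own addition, and only a sketch) that computes e^{tA} for this A and evaluates the formula above with \bf{x}(0) = 0:

    ```python
    import sympy as sp

    t, mu = sp.symbols('t mu')

    A = sp.Matrix([[1, 2], [0, -1]])
    b = sp.Matrix([0, 1])

    # Matrix exponential e^{tA}; sympy computes it from the eigenstructure of A
    etA = sp.simplify((t * A).exp())
    print(etA)     # Matrix([[exp(t), exp(t) - exp(-t)], [0, exp(-t)]])

    # x(t) = e^{tA} * Integral_0^t e^{-mu*A} b dmu   (taking x(0) = 0)
    x_part = sp.simplify(etA * ((-mu * A).exp() * b).integrate((mu, 0, t)))
    print(x_part)  # Matrix([[exp(t) + exp(-t) - 2], [1 - exp(-t)]])
    ```

    This is the particular solution with x(0) = 0; adding e^{tA}\bf{x}(0) for an arbitrary \bf{x}(0) recovers the same general solution as the scalar approach below.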
    Last edited by abender; December 29th 2009 at 05:17 PM.

  5. #5
    Member
    Joined
    Nov 2009
    Posts
    75
    Thanks!! I'm still trying to follow through.


    I'm not very good with things unless they're worked out directly from how they originally are, but I have to know the method anyway.

  6. #6
    Senior Member
    Joined
    Mar 2008
    From
    Pennsylvania, USA
    Posts
    269
    Thanks
    37
    You're welcome and I appreciate your appreciation.

    Which part of the problem are you having the most difficulty with? Is it the "setting things up" part? Crunching the integral? Just getting started?

    -Andy

  7. #7
    Member
    Joined
    Nov 2009
    Posts
    75
    Quote Originally Posted by abender View Post
    You're welcome and I appreciate your appreciation.

    Which part of the problem are you having the most difficulty with? Is it the "setting things up" part? Crunching the integral? Just getting started?

    -Andy
    The main things I'm having difficulty with (I still don't know! I spent the night trying to figure it out; maths takes me forever to solve) are these:

    y' = -y + 1 implies y(t) = Ce^-t + 1

    OK, so for this I integrated just the homogeneous equation first, which I took as dy/dt + y = 0, and I got Ce^-t from that! But how did the +1 get there too? =S ... I'm not sure how to solve this without making errors or forgetting the +1.

    And when I tried to integrate
    x' = x + 2(Ce^-t + 1)

    I couldn't get
    x(t) = -2 - ce^-t + de^t

    I tried integrating by parts... substitution... it won't work.




  8. #8
    MHF Contributor
    Jester's Avatar
    Joined
    Dec 2008
    From
    Conway AR
    Posts
    2,375
    Thanks
    48
    Quote Originally Posted by matlabnoob View Post
    The main things I'm having difficulty with (I still don't know! I spent the night trying to figure it out; maths takes me forever to solve) are these:

    y' = -y + 1 implies y(t) = Ce^-t + 1

    OK, so for this I integrated just the homogeneous equation first, which I took as dy/dt + y = 0, and I got Ce^-t from that! But how did the +1 get there too? =S ... I'm not sure how to solve this without making errors or forgetting the +1.

    And when I tried to integrate
    x' = x + 2(Ce^-t + 1)

    I couldn't get
    x(t) = -2 - ce^-t + de^t

    I tried integrating by parts... substitution... it won't work.


    First ODE (*): \frac{dy}{dt} = -(y-1), so \frac{dy}{y-1} = -dt, so \ln|y-1| = -t + \ln c. Exponentiating gives y = ce^{-t} + 1.

    Second ODE (**): \frac{dx}{dt} - x = 2\left(c e^{-t} + 1\right). The integrating factor is

    \mu = e^{-t}, so e^{-t}\left(\frac{dx}{dt} - x\right) = 2e^{-t}\left(c e^{-t} + 1\right).

    Thus \frac{d}{dt}\left(x e^{-t}\right) = 2c e^{-2t} + 2e^{-t}, and integrating gives x e^{-t} = -c e^{-2t} - 2e^{-t} + d. Isolating x gives x = -c e^{-t} - 2 + de^{t}.
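
    And a quick sanity check (my addition, using sympy) that this pair really does satisfy the original system:

    ```python
    import sympy as sp

    t, c, d = sp.symbols('t c d')

    x = -c*sp.exp(-t) - 2 + d*sp.exp(t)
    y = c*sp.exp(-t) + 1

    # Both residuals simplify to 0, so (x, y) solves x' = x + 2y, y' = -y + 1
    print(sp.simplify(x.diff(t) - (x + 2*y)))   # 0
    print(sp.simplify(y.diff(t) - (-y + 1)))    # 0
    ```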

  9. #9
    Senior Member
    Joined
    Mar 2008
    From
    Pennsylvania, USA
    Posts
    269
    Thanks
    37
    Good morning.


    The question asks us to find the general solution to

    \boldsymbol{x}'(t) = A\boldsymbol{x}(t) + \boldsymbol{b}(t)

    whereby

    A = \begin{bmatrix} 1 & 2 \\ 0 & -1 \end{bmatrix}, \quad \boldsymbol{b}(t) = \begin{bmatrix} 0 \\ 1 \end{bmatrix}.

    Just in case it is a source of confusion for you, I'll clarify that \boldsymbol{x} can be (and in this post, is) expressed as follows:

    \boldsymbol{x}(t) = \begin{bmatrix} x \\ y \end{bmatrix}.


    Now I will generalize much less and try to spell out each step and computation for you. It will take a bit longer, but perhaps you need to see things this way.

    We begin by plugging what we know into

    \boldsymbol{x}'(t) = A\boldsymbol{x}(t) + \boldsymbol{b}(t).

    We get

    \begin{bmatrix} x' \\ y' \end{bmatrix} =
    \begin{bmatrix} 1 & 2 \\ 0 & -1 \end{bmatrix}
    \begin{bmatrix} x \\ y \end{bmatrix} +
    \begin{bmatrix} 0 \\ 1 \end{bmatrix} =
    \begin{bmatrix} x + 2y \\ -y \end{bmatrix} +
    \begin{bmatrix} 0 \\ 1 \end{bmatrix} =
    \begin{bmatrix} x + 2y \\ -y + 1 \end{bmatrix}.


    So we have

    \begin{bmatrix} x' \\ y' \end{bmatrix} =
    \begin{bmatrix} x + 2y \\ -y + 1 \end{bmatrix}.

    Now we multiply both sides by the integrating factor  e^{-tA} .

    e^{-tA} \begin{bmatrix} x' \\ y' \end{bmatrix} =
    e^{-tA} \begin{bmatrix} x + 2y \\ -y + 1 \end{bmatrix}

    Now, let's explain e^{-tA}, which is called a matrix exponential. I'll give a general definition:

    Let M be an n \times n matrix. e^M is the matrix exponential of M, which is expressed by the power series

    e^{M} = \sum_{k=0}^\infty \frac{1}{k!} M^k.

    Note that the exponential of M is well-defined, as the series always converges.
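
    If it helps to see the series in action, here is a small sympy snippet (my addition; the truncation level N = 20 and the evaluation point t = 1/2 are arbitrary choices) comparing a truncated sum with sympy's exact matrix exponential for this A:

    ```python
    import sympy as sp

    A = sp.Matrix([[1, 2], [0, -1]])
    M = sp.Rational(1, 2) * A          # evaluate e^{tA} at t = 1/2

    # Truncated power series  sum_{k=0}^{N} M^k / k!
    N = 20
    series = sum((M**k / sp.factorial(k) for k in range(N + 1)), sp.zeros(2, 2))

    print(sp.N(series, 10))   # numerically matches the exact matrix exponential
    print(sp.N(M.exp(), 10))
    ```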

    Are you following so far?
    Last edited by abender; December 30th 2009 at 12:00 PM.
