Math Help - Transition matrix for a Markov process

  1. #1
    Newbie
    Joined
    Aug 2012
    From
    South Carolina
    Posts
    2

    Transition matrix for a Markov process

    Need help solving the transition matrix for a Markov process. First I have to find the initial state if S0 = [.3 .7].

                 A     B
    P =  A  [ 0.3   0.7 ]
         B  [ 0.9   0.1 ]

  2. #2
    MHF Contributor
    Prove It's Avatar
    Joined
    Aug 2008
    Posts
    11,804
    Thanks
    1576

    Re: Transition matrix for a Markov process

    Quote Originally Posted by Taco View Post
    Need help solving the transition matrix for a Markov process. First I have to find the initial state if S0 = [.3 .7].

                 A     B
    P =  A  [ 0.3   0.7 ]
         B  [ 0.9   0.1 ]
    This is almost unreadable. First of all, what do you mean by "solving the transition matrix"? You HAVE the transition matrix. It's \displaystyle \begin{align*}  \left[ \begin{matrix} 0.3 & 0.7 \\ 0.9 & 0.1 \end{matrix} \right] \end{align*}...
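    A quick sanity check (a Python sketch, not from the thread): every row of a valid transition matrix must sum to 1, since each row holds the probabilities of moving from one state to every state. This matrix passes:

    ```python
    # Transition matrix from the problem.
    P = [[0.3, 0.7],
         [0.9, 0.1]]

    # Each row's probabilities should total 1 (up to floating-point rounding).
    row_sums = [sum(row) for row in P]
    print(row_sums)
    ```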

  3. #3
    Newbie
    Joined
    Aug 2012
    From
    South Carolina
    Posts
    2

    Re: Transition matrix for a Markov process

    You have written the transition matrix correctly as it is in my problem. I tried, but it did not work for me. I have to find the first state matrix if the initial state is S0 = [.3 .7]. This is so confusing to me. Thanks for your time.

  4. #4
    MHF Contributor
    Prove It's Avatar
    Joined
    Aug 2008
    Posts
    11,804
    Thanks
    1576

    Re: Transition matrix for a Markov process

    Quote Originally Posted by Taco View Post
    You have written the transition matrix correctly as it is in my problem. I tried, but it did not work for me. I have to find the first state matrix if the initial state is S0 = [.3 .7]. This is so confusing to me. Thanks for your time.
    That makes a little more sense. You are being asked to find \displaystyle \begin{align*} S_1 \end{align*}. Since each row of the transition matrix sums to 1, the states are row vectors, so \displaystyle \begin{align*} S_1 = S_0\,T \end{align*}.
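    A minimal Python sketch of that computation (assuming the row-vector convention S1 = S0 P, which fits this matrix because each row of P sums to 1):

    ```python
    # First state vector S1 from the initial state S0 and transition matrix P.
    # Row-vector convention: S1 = S0 * P.

    P = [[0.3, 0.7],
         [0.9, 0.1]]   # transition matrix from the problem
    S0 = [0.3, 0.7]    # initial state

    # S1[j] = sum over i of S0[i] * P[i][j]
    S1 = [sum(S0[i] * P[i][j] for i in range(2)) for j in range(2)]
    print(S1)  # approximately [0.72, 0.28]
    ```

    Note that S1 is again a probability vector (its entries sum to 1), as it must be after one step of the chain.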
