Transition matrix for a Markov process

• Aug 5th 2012, 05:08 AM
Taco
Transition matrix for a Markov process
Need help solving the transition matrix for a Markov process. First I have to find the initial state if S0 =[.3 .7]
              A     B
    State A [ 0.3   0.7 ]  = P
          B [ 0.9   0.1 ]
• Aug 5th 2012, 05:11 AM
Prove It
Re: Transition matrix for a Markov process
Quote:

Originally Posted by Taco
Need help solving the transition matrix for a Markov process. First I have to find the initial state if S0 =[.3 .7]
              A     B
    State A [ 0.3   0.7 ]  = P
          B [ 0.9   0.1 ]

This is almost unreadable. First of all, what do you mean by "solving the transition matrix"? You HAVE the transition matrix. It's $\left[ \begin{matrix} 0.3 & 0.7 \\ 0.9 & 0.1 \end{matrix} \right]$...
• Aug 5th 2012, 05:26 AM
Taco
Re: Transition matrix for a Markov process
You have written the transition matrix correctly as it is in my problem. I tried but it did not work for me... I have to find the first state matrix given that the initial state is S0 = [.3 .7]. This is so confusing to me. Thanks for your time.
• Aug 5th 2012, 05:36 AM
Prove It
Re: Transition matrix for a Markov process
Quote:

Originally Posted by Taco
You have written the transition matrix correctly as it is in my problem. I tried but it did not work for me... I have to find the first state matrix given that the initial state is S0 = [.3 .7]. This is so confusing to me. Thanks for your time.

That makes a little more sense. You are being asked to find $S_1$. Notice that since $S_0$ is a row vector and the rows of the transition matrix $T$ sum to 1, the next state is $S_1 = S_0\,T$.
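A quick numerical check of that last step (a minimal sketch using NumPy; the matrix and S0 are the ones given in the thread):

```python
import numpy as np

# Transition matrix from the problem (each row sums to 1,
# so state distributions are row vectors)
T = np.array([[0.3, 0.7],
              [0.9, 0.1]])

# Initial state distribution S0
S0 = np.array([0.3, 0.7])

# First state matrix: multiply the row vector S0 on the right by T
S1 = S0 @ T
print(S1)  # ≈ [0.72 0.28]
```

By hand this is the same computation: S_1 = [0.3(0.3) + 0.7(0.9), 0.3(0.7) + 0.7(0.1)] = [0.72, 0.28], and the entries still sum to 1 as a probability distribution must.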