# Thread: Transition matrix for a Markov process

1. ## Transition matrix for a Markov process

Need help solving the transition matrix for a Markov process. First I have to find the initial state if S0 = [.3 .7]. The transition matrix (rows and columns ordered A, B) is

\displaystyle P = \left[ \begin{matrix} 0.3 & 0.7 \\ 0.9 & 0.1 \end{matrix} \right]

2. ## Re: Transition matrix for a Markov process

Originally Posted by Taco
Need help solving the transition matrix for a Markov process. First I have to find the initial state if S0 = [.3 .7]. The transition matrix (rows and columns ordered A, B) is

\displaystyle P = \left[ \begin{matrix} 0.3 & 0.7 \\ 0.9 & 0.1 \end{matrix} \right]
This is almost unreadable. First of all, what do you mean by "solving the transition matrix"? You HAVE the transition matrix. It's \displaystyle \begin{align*} \left[ \begin{matrix} 0.3 & 0.7 \\ 0.9 & 0.1 \end{matrix} \right] \end{align*}...

3. ## Re: Transition matrix for a Markov process

You have written the transition matrix correctly as it is in my problem. I tried, but it did not work for me. I have to find the first state matrix if the initial state is S0 = [.3 .7]. This is so confusing to me. Thanks for your time.

4. ## Re: Transition matrix for a Markov process

Originally Posted by Taco
You have written the transition matrix correctly as it is in my problem. I tried, but it did not work for me. I have to find the first state matrix if the initial state is S0 = [.3 .7]. This is so confusing to me. Thanks for your time.
That makes a little more sense. You are being asked to find \displaystyle \begin{align*} S_1 \end{align*}. Notice that, with \displaystyle \begin{align*} S_0 \end{align*} written as a row vector, \displaystyle \begin{align*} S_1 = S_0\,P \end{align*}.
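The update above can be checked numerically. Here's a quick sketch in Python (not from the thread, just an illustration of the row-vector multiplication S1 = S0 P), using plain lists:

```python
# Transition matrix P: row i = current state, column j = next state
# (states ordered A, B, as in the problem).
P = [[0.3, 0.7],
     [0.9, 0.1]]

# Initial state distribution S0 as a row vector.
S0 = [0.3, 0.7]

# First state matrix: S1[j] = sum over i of S0[i] * P[i][j].
S1 = [sum(S0[i] * P[i][j] for i in range(2)) for j in range(2)]

print(S1)  # approximately [0.72, 0.28]
```

So after one step the process is in state A with probability 0.72 and state B with probability 0.28 (entries of S1 still sum to 1, as a probability vector must).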