Need help solving the transition matrix for a Markov process. First I have to find the first state matrix if the initial state is S0 = [0.3 0.7] and the transition matrix is

           A     B
P =   A [ 0.3   0.7 ]
      B [ 0.9   0.1 ]


- Aug 5th 2012, 06:08 AM, Taco: Transition matrix for a Markov process
- Aug 5th 2012, 06:11 AM, Prove It: Re: Transition matrix for a Markov process
- Aug 5th 2012, 06:26 AM, Taco: Re: Transition matrix for a Markov process
You have written the transition matrix correctly, as it is in my problem. I tried, but it did not work for me. I have to find the first state matrix if the initial state is S0 = [0.3 0.7]. This is so confusing to me. Thanks for your time.
- Aug 5th 2012, 06:36 AM, Prove It: Re: Transition matrix for a Markov process
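For reference, the first state matrix is obtained by multiplying the initial state row vector by the transition matrix: S1 = S0·P. A minimal NumPy sketch of that step, using the S0 and P given in the question (variable names are my own):

```python
import numpy as np

# Initial state distribution and transition matrix from the problem.
S0 = np.array([0.3, 0.7])
P = np.array([[0.3, 0.7],
              [0.9, 0.1]])

# First state matrix: row vector S0 times transition matrix P.
# Componentwise: [0.3*0.3 + 0.7*0.9,  0.3*0.7 + 0.7*0.1]
S1 = S0 @ P
print(S1)  # [0.72 0.28]
```

Repeating the multiplication (S2 = S1·P, and so on) gives the state distribution after each further step.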