Markov Chain questions

    (1) Find X2 (the probability distribution of the system after two observations) for the distribution vector X0 and the transition matrix T.

    X0 = [1/6]
         [5/6]
         [ 0 ]

    T = [1/2  1/3  1/2]
        [ 0   1/3  1/4]
        [1/2  1/3  1/4]

    X2 = ______
    ______
    ______
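Not part of the original post, but one way to check an answer to (1) is to iterate X_{k+1} = T X_k twice, using Python's `fractions` module so the arithmetic stays exact:

```python
from fractions import Fraction as F

# Transition matrix T and initial distribution X0 from problem (1)
T = [[F(1, 2), F(1, 3), F(1, 2)],
     [F(0),    F(1, 3), F(1, 4)],
     [F(1, 2), F(1, 3), F(1, 4)]]
x = [F(1, 6), F(5, 6), F(0)]

# Two observations: X2 = T (T X0)
for _ in range(2):
    x = [sum(T[i][j] * x[j] for j in range(3)) for i in range(3)]

print(x)  # [Fraction(49, 108), Fraction(79, 432), Fraction(157, 432)]
```

The components still sum to 1, as any probability distribution must.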


    (2) Find the steady-state vector for the transition matrix.

    [5/7  4/7]
    [2/7  3/7]

    X = _____
        _____
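For a 2x2 column-stochastic matrix, the steady-state condition T x = x together with x1 + x2 = 1 reduces to one equation, which a short (unofficial) Python sketch can solve exactly:

```python
from fractions import Fraction as F

# Transition matrix from problem (2)
T = [[F(5, 7), F(4, 7)],
     [F(2, 7), F(3, 7)]]

# T x = x gives (1 - T[0][0]) * x1 = T[0][1] * x2, with x1 + x2 = 1
a = T[0][1]      # flow into state 1
b = 1 - T[0][0]  # flow out of state 1
x1 = a / (a + b)
x = [x1, 1 - x1]

print(x)  # [Fraction(2, 3), Fraction(1, 3)]
```

As a check, multiplying T by (2/3, 1/3) returns (2/3, 1/3) unchanged.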

    (3) Find the steady-state vector for the transition matrix.

    [.6  .1   0]
    [.4  .8  .6]
    [ 0  .1  .4]

    X = ___
        ___
        ___
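For the 3x3 case, an answer can be checked by power iteration: repeatedly applying T to any starting distribution converges to the steady-state vector (a sketch, not the method the course necessarily expects):

```python
# Transition matrix from problem (3)
T = [[0.6, 0.1, 0.0],
     [0.4, 0.8, 0.6],
     [0.0, 0.1, 0.4]]

# Power iteration: any probability vector works as a start
x = [1/3, 1/3, 1/3]
for _ in range(200):
    x = [sum(T[i][j] * x[j] for j in range(3)) for i in range(3)]

print(x)  # approaches (3/17, 12/17, 2/17)
```

Solving (T - I)x = 0 with x1 + x2 + x3 = 1 by hand gives the same exact fractions: x = (3/17, 12/17, 2/17).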



    (4)The transition matrix for a Markov process is given by
                 1      2
    T = State 1 [1/3   5/6]
        State 2 [2/3   1/6]

    (a) Given that the outcome state 1 has occurred, what is the probability that the next outcome of the experiment will be state 2?


    (b) If the initial-state distribution is given by


    X0 = State 1 [1/5]
         State 2 [4/5]

    find TX0, the probability distribution of the system after one observation.

    X1 = ____
         ____
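Both parts of (4) read straight off the matrix, which a short exact-fraction check (again just a sketch, not from the original post) makes explicit:

```python
from fractions import Fraction as F

# Transition matrix and initial distribution from problem (4)
T  = [[F(1, 3), F(5, 6)],
      [F(2, 3), F(1, 6)]]
x0 = [F(1, 5), F(4, 5)]

# (a) P(next = state 2 | current = state 1) is the row-2, column-1 entry
print(T[1][0])  # 2/3

# (b) one observation: X1 = T X0
x1 = [sum(T[i][j] * x0[j] for j in range(2)) for i in range(2)]
print(x1)  # [Fraction(11, 15), Fraction(4, 15)]
```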
    Last edited by foz124; April 20th 2012 at 04:24 PM.