Markov Chain problem

  1. #1
    Member
    Joined
    Mar 2009
    From
    Gold Coast, Queensland, Australia
    Posts
    105

    Markov Chain problem

    Given a Markov chain with states \{e_1, e_2, e_3\} and the transition probability matrix

    P = \begin{bmatrix}0.2 & 0.8 & 0\\1 & 0 & 0\\0.7 & 0 & 0.3\end{bmatrix}

    Which one of the following is false?
    a) e_1 communicates with e_2
    b) e_1 is accessible from e_3
    c) e_3 communicates with e_1
    d) e_2 is accessible from e_3

  2. #2
    Super Member

    Joined
    May 2006
    From
    Lexington, MA (USA)
    Posts
    11,908
    Thanks
    766
    Hello, Bucephalus!

    Given a Markov chain with states \{e_1, e_2, e_3\} and the transition probability matrix

    P \;=\; \begin{bmatrix}0.2 & 0.8 & 0\\1 & 0 & 0\\0.7 & 0 & 0.3\end{bmatrix}

    Which one of the following is false?

    a) e_1 communicates with e_2
    I assume that "communicates" is a two-way relationship.
    That is: e_1\to e_2 and e_2\to e_1.

    Since \begin{Bmatrix}P(e_1\to e_2) &=& 0.8 \\ P(e_2\to e_1) &=& 1\end{Bmatrix}, the statement is true.




    b) e_1 is accessible from e_3

    Since P(e_3\to e_1) \:=\: 0.7, the statement is true.




    c) e_3 communicates with e_1

    We have: P(e_3\to e_1) \:=\: 0.7

    but: P(e_1\to e_3) \:=\: 0

    In fact, the only positive entry in the third column of P is P(e_3\to e_3) = 0.3, so e_3 can only be entered from e_3 itself; once the chain leaves e_3 it can never return, and e_3 is not accessible from e_1 in any number of steps.

    e_1 and e_3 do not communicate.

    This statement is false.




    d) e_2 is accessible from e_3

    Since \begin{Bmatrix}P(e_3\to e_1) &=& 0.7 \\ P(e_1\to e_2) &=& 0.8 \end{Bmatrix}

    then: P\left([e_3\to e_1] \wedge [e_1\to e_2]\right) \;=\;(0.7)(0.8)

    Hence the two-step probability P(e_3\to e_1\to e_2) \:=\: 0.56 > 0, so e_2 is accessible from e_3, and the statement is true.
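
    For anyone who wants to double-check the accessibility and communication claims numerically, here is a small sketch (not part of the original solution) in Python with NumPy. It treats state e_j as accessible from e_i whenever some matrix power P^n has a positive (i, j) entry, which is the usual "path of positive probability" criterion; the indices 0, 1, 2 stand for e_1, e_2, e_3.

[code]
import numpy as np

# Transition matrix from the problem; rows/columns 0, 1, 2 correspond to e_1, e_2, e_3.
P = np.array([[0.2, 0.8, 0.0],
              [1.0, 0.0, 0.0],
              [0.7, 0.0, 0.3]])

def accessible(P, i, j):
    """True if state j is accessible from state i, i.e. some n-step
    transition probability (P^n)[i, j] is positive. Checking powers up to
    the number of states is enough, since any reachable state is reachable
    by a path that visits no state twice."""
    Pn = np.eye(P.shape[0])
    for _ in range(P.shape[0] + 1):
        if Pn[i, j] > 0:
            return True
        Pn = Pn @ P
    return False

def communicate(P, i, j):
    """States communicate when each is accessible from the other."""
    return accessible(P, i, j) and accessible(P, j, i)

print("a) e_1 communicates with e_2:", communicate(P, 0, 1))   # True
print("b) e_1 accessible from e_3: ", accessible(P, 2, 0))     # True
print("c) e_3 communicates with e_1:", communicate(P, 2, 0))   # False
print("d) e_2 accessible from e_3: ", accessible(P, 2, 1))     # True
[/code]

    Running it prints True, True, False, True, matching the conclusion above that (c) is the false statement.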


  3. #3
    Member
    Joined
    Mar 2009
    From
    Gold Coast, Queensland, Australia
    Posts
    105

    Awesome reply

    Thanks for your help.
    David.
