
Math Help - Markov Chains

  1. #1
    Junior Member
    Joined
    Oct 2008
    Posts
    70

    Markov Chains

Consider the Markov chain with the transition matrix:

    [1/2 1/3 1/6]
    [3/4 0 1/4]
    [ 0 1 0 ]

The process is started in state 1. Find the probability that it is in state 3 after two steps.

  2. #2
    Super Member

    Joined
    May 2006
    From
    Lexington, MA (USA)
    Posts
    11,914
    Thanks
    779
    Hello, morganfor!

Consider the transition matrix: . A \;=\; \begin{pmatrix}\frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\ \frac{3}{4} & 0 & \frac{1}{4} \\ 0 & 1 & 0 \end{pmatrix}

    The process is started in state 1.
    Find the probability that it is in state 3 after two steps.
    We want A^2.

    . . A^2 \;=\;\begin{pmatrix}\frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\ \frac{3}{4} & 0 & \frac{1}{4} \\ 0 & 1 & 0 \end{pmatrix} \begin{pmatrix}\frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\ \frac{3}{4} & 0 & \frac{1}{4} \\ 0 & 1 & 0 \end{pmatrix} \;=\; \begin{pmatrix}\frac{1}{2} & \frac{1}{3} & {\color{red}\frac{1}{6}} \\ \frac{3}{8} & \frac{1}{2} & \frac{1}{8} \\ \frac{3}{4} & 0 & \frac{1}{4} \end{pmatrix}


    Therefore: . P(a_1\to a_3\text{, 2 steps}) \:=\:\frac{1}{6}
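As a sanity check, A^2 can be computed exactly (a sketch in Python using the standard fractions module; the matrix is the one given above):

```python
from fractions import Fraction as F

# Transition matrix from the problem (each row sums to 1).
A = [[F(1, 2), F(1, 3), F(1, 6)],
     [F(3, 4), F(0),    F(1, 4)],
     [F(0),    F(1),    F(0)]]

def matmul(X, Y):
    """Exact product of two square matrices of Fractions."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A2 = matmul(A, A)
# Entry (1, 3) of A^2 is the probability of being in state 3
# two steps after starting in state 1.
print(A2[0][2])  # 1/6
```

Working with Fractions instead of floats keeps every entry exact, so the result can be compared against the hand computation directly.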


  3. #3
    Junior Member
    Joined
    Jan 2009
    Posts
    56
    No need to calculate A^2 if you want f^{(2)}_{1,3}, the probability of reaching state 3 for the first time after two steps from state 1.
    This probability is just P_{1,1} P_{1,3} + P_{1,2} P_{2,3}; there is no need to calculate the whole matrix.
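Numerically, the shortcut gives the same answer (a sketch; the post's 1-based indices become 0-based here):

```python
from fractions import Fraction as F

# Transition matrix from the original post.
P = [[F(1, 2), F(1, 3), F(1, 6)],
     [F(3, 4), F(0),    F(1, 4)],
     [F(0),    F(1),    F(0)]]

# f^(2)_{1,3} = P_{1,1} P_{1,3} + P_{1,2} P_{2,3}: the two paths
# 1 -> 1 -> 3 and 1 -> 2 -> 3 that avoid state 3 at step one.
f2 = P[0][0] * P[0][2] + P[0][1] * P[1][2]
print(f2)  # 1/6
```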

  4. #4
    Junior Member
    Joined
    Mar 2009
    Posts
    64
    Actually, if you're going to nitpick, it is:

    P_{1,1} P_{1,3} + P_{1,2} P_{2,3} + P_{1,3} P_{3,3}

    Sure, P_{3,3} = 0, but we don't want to give wrong formulas, do we?

  5. #5
    Newbie
    Joined
    Aug 2009
    Posts
    1

    Exam tomorrow

    I've got an exam tomorrow and I'm going over past papers. I'm stuck on these questions and would be grateful for any help.


    Question 1.
    A random walk starts at k, where k is a positive integer, and takes place on the integers. It is assumed that X_{n+1}, the location of the walk after (n+1) steps, satisfies

    X_{n+1} = X_n + 1 with probability p
    X_{n+1} = X_n - 1 with probability 1 - p

    (a) Find the mean and variance of X_n, and comment on your result when p = 1/2.
    (b) Determine the value of p which maximizes the expression for the variance obtained in (a) above, and comment.
    (c) Derive the probability distribution of X_n when k = 0.
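For part (a), a quick numerical check (not the derivation the exam wants) uses the fact that with u up-steps out of n, X_n = k + 2u - n where u is Binomial(n, p); the values of k, n, p below are arbitrary:

```python
from math import comb

def walk_distribution(k, n, p):
    """Exact distribution of X_n for the walk starting at k:
    with u up-steps, X_n = k + u - (n - u) = k + 2u - n,
    and u ~ Binomial(n, p)."""
    return {k + 2 * u - n: comb(n, u) * p**u * (1 - p)**(n - u)
            for u in range(n + 1)}

def mean_var(dist):
    m = sum(x * q for x, q in dist.items())
    v = sum((x - m)**2 * q for x, q in dist.items())
    return m, v

k, n, p = 3, 10, 0.3  # arbitrary example values
m, v = mean_var(walk_distribution(k, n, p))
# These match E[X_n] = k + n(2p - 1) and Var[X_n] = 4np(1 - p);
# note the variance 4np(1 - p) is maximised at p = 1/2.
print(m, v)
```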
    Question 2.

    (b) A Markov chain has the following transition matrix P, where:

    P =
    [  0    0    0    0    1  ]
    [ 1/8  1/2  3/8   0    0  ]
    [  0    0   1/4  3/4   0  ]
    [  0    0   1/3   0   2/3 ]
    [  1    0    0    0    0  ]

    (i) Draw the Chain Diagram showing the transitions between the states.
    (ii) Identify communicating classes and indicate whether they are closed or not.
    (iii) State which of the properties in (a) each state possesses.
    (iv) Determine whether this Markov chain has a unique stationary distribution and, if so, find it.
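For part (iv), the stationary distribution can be checked numerically by solving pi P = pi together with sum(pi) = 1 (a sketch with NumPy; it is a cross-check, not the reasoning about communicating classes the question asks for):

```python
import numpy as np

# Transition matrix from Question 2(b).
P = np.array([
    [0,   0,   0,   0,   1],
    [1/8, 1/2, 3/8, 0,   0],
    [0,   0,   1/4, 3/4, 0],
    [0,   0,   1/3, 0,   2/3],
    [1,   0,   0,   0,   0],
])

# Stack the stationarity equations pi (P - I) = 0 with the
# normalisation sum(pi) = 1 and solve as a least-squares system.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi.round(6))  # mass splits evenly between states 1 and 5
```

Solving the linear system directly (rather than iterating powers of P) avoids any trouble from periodicity of the closed class.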

  6. #6
    Junior Member
    Joined
    Jan 2009
    Posts
    56
    Quote Originally Posted by pedrosorio
    Actually, if you're going to nitpick it is:

    P1,1*P1,3 + P1,2*P2,3+P1,3*P3,3

    Sure P3,3 = 0, but we don't want to give wrong formulas, do we?
    The term with P_{3,3} is always excluded in the case of f^{(n)}_{1,3}, n > 1, because f^{(2)}_{1,3} is the probability that we first arrive at state 3 after two steps, which means the first step must not be towards state 3.

