
Markov Chains - Working out the transition matrix

  1. #1

    Markov Chains - Working out the transition matrix

    Hi guys,

    An office building has a large number of elevators, and the number of elevators that break down on a given day follows a geometric distribution with probability function P(k) = (0.4)(0.6)^k, k = 0, 1, 2, .... When 0, 1, 2, or 3 elevators are broken, the office staff can still function, so the elevators are not fixed. When 4 or more elevators are broken, the office shuts down and all elevators are fixed for the beginning of the next business day.

    I get that I should use a Markov chain, and that my states will be the number of elevators broken on a given day (so 0, 1, 2, 3, >=4).

    The thing I am a little confused about is how to be sure that the transition probabilities I write down really are conditional probabilities.

    For example, is Pr(1 elevator broken tomorrow | 1 elevator broken today) = Pr(no additional elevators break down) = 0.4(0.6)^0 = 0.4, because geometric distributions have independent probabilities and so it doesn't matter how many elevators have already broken down?

    Or is Pr(1 elevator broken tomorrow | 1 elevator broken today) = Pr(no additional elevators break down | 1 elevator broken today) which I haven't a clue how to work out.

    Please help
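
    For concreteness, here is a minimal numeric sketch of the first reading: assume the number of new breakdowns tomorrow is a single geometric draw with P(K = k) = (0.4)(0.6)^k, no matter how many elevators are already broken today. The variable names are just for illustration.

    ```
    # Sketch only: assumes tomorrow's NEW breakdowns K follow a geometric
    # distribution with P(K = k) = 0.4 * 0.6**k, independent of today's count.
    p, q = 0.4, 0.6

    # From "1 broken today" to "j broken tomorrow" (j < 4): j - 1 new breakdowns.
    for j in range(1, 4):
        print(f"P(1 -> {j}) = {p * q**(j - 1):.4f}")
    # P(1 -> 1) = 0.4000   (no additional breakdowns, the 0.4 in the question)
    # P(1 -> 2) = 0.2400
    # P(1 -> 3) = 0.1440
    ```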

  2. #2
    Oops, I meant that the geometric distribution is memoryless, not that the probabilities are independent. But what that means is that it doesn't matter whether 3 are already broken down or none are: the probability of one more breaking down the next day will still be the same, right?
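
    Under that reading the whole transition matrix follows the same pattern. Here is a short Python sketch of the 5x5 matrix; it assumes the daily number of new breakdowns is an independent geometric draw, and that after a shutdown day the chain restarts at 0 broken (so the ">=4" row matches the row for state 0). The helper name geom_pmf is just for illustration.

    ```
    p, q = 0.4, 0.6

    def geom_pmf(k):
        """P(exactly k new breakdowns in a day) = 0.4 * 0.6**k."""
        return p * q**k

    # States 0..3 = that many broken; index 4 stands for ">=4 broken" (shutdown).
    P = [[0.0] * 5 for _ in range(5)]

    for i in range(5):
        start = 0 if i == 4 else i      # after a shutdown, everything was fixed
        for j in range(start, 4):       # end the day with j < 4 broken
            P[i][j] = geom_pmf(j - start)
        P[i][4] = q ** (4 - start)      # geometric tail: P(K >= 4 - start)

    for row in P:
        print([round(x, 4) for x in row])
    # Row for state 1: [0.0, 0.4, 0.24, 0.144, 0.216]  (each row sums to 1)
    ```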
