# Markov Chains - Working out the transition matrix

• Mar 14th 2011, 01:50 PM
PeterPan01
Hi guys,

An office building has a large number of elevators. The number that break down each day follows a geometric distribution with probability function P(K = k) = (0.4)(0.6)^k, for k = 0, 1, 2, … . When 0, 1, 2, or 3 elevators are broken, the office staff can still function, so the elevators are not fixed. When 4 or more elevators are broken, the office shuts down and all elevators are fixed before the beginning of the next business day.
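As a quick sanity check on that distribution (assuming, as I read it, that k counts the number of elevators breaking down in a single day), the pmf really does sum to 1, and the geometric tail P(K ≥ m) = (0.6)^m gives the chance of enough breakdowns to trigger a shutdown:

```python
# Sanity check of the daily breakdown distribution.
# Assumption: K = number of elevators that break down in one day,
# with P(K = k) = 0.4 * 0.6**k for k = 0, 1, 2, ...

def pmf(k):
    """P(K = k) for the daily breakdown count."""
    return 0.4 * 0.6 ** k

# The pmf should sum to 1 (truncate the infinite sum; the tail is negligible).
total = sum(pmf(k) for k in range(200))

# Tail probability P(K >= m) = 0.6**m, e.g. P(K >= 4) = 0.6**4 = 0.1296,
# which is the chance of jumping from 0 broken straight to a shutdown.
p_shutdown_from_zero = 0.6 ** 4

print(total)                  # approximately 1.0
print(p_shutdown_from_zero)   # 0.1296
```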

I get that I should use a Markov Chain, and that my states will be the number of elevators broken in a given day (so 0, 1, 2, 3, >=4).

The thing I am a little confused about is how to be certain that my probabilities are conditional probabilities.

For example, is Pr(1 elevator broken tomorrow | 1 elevator broken today) = Pr(no additional elevators break down) = (0.4)(0.6)^0 = 0.4, on the grounds that each day's breakdown count is an independent geometric draw, so it doesn't matter how many elevators have already broken down?

Or is Pr(1 elevator broken tomorrow | 1 elevator broken today) = Pr(no additional elevators break down | 1 elevator broken today), which I haven't a clue how to work out?
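In case it helps to see what the matrix looks like under the first interpretation, here is a sketch I put together. It assumes (my assumptions, not given in the problem) that each day's breakdown count K is an independent geometric draw with P(K = k) = 0.4(0.6)^k, and that from the "≥ 4" state the chain behaves like state 0 on the next day, since all elevators are fixed overnight:

```python
# Sketch of the 5x5 transition matrix for states 0, 1, 2, 3, ">= 4".
# Assumptions: daily breakdown counts are i.i.d. geometric,
# P(K = k) = 0.4 * 0.6**k, and after a shutdown all elevators are
# repaired, so state ">= 4" transitions exactly like state 0.

def pmf(k):
    """P(K = k): probability that exactly k elevators break in a day."""
    return 0.4 * 0.6 ** k

def tail(m):
    """P(K >= m) = 0.6**m (geometric tail)."""
    return 0.6 ** m

P = [[0.0] * 5 for _ in range(5)]  # rows/cols: 0, 1, 2, 3, ">= 4"

for i in range(4):                  # from states 0..3, breakdowns accumulate
    for j in range(i, 4):
        P[i][j] = pmf(j - i)        # exactly j - i new breakdowns
    P[i][4] = tail(4 - i)           # enough breakdowns to reach >= 4

P[4] = P[0][:]                      # all fixed overnight: restart from 0

for row in P:
    print([round(x, 4) for x in row])
```

Each row sums to 1, and the (1, 1) entry comes out to pmf(0) = 0.4, matching the first interpretation above.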