# Math Help - Markov chain/transition matrix problem.

1. ## Markov chain/transition matrix problem.

Hey, I am currently learning about Markov chains in my data management course, and I am stuck on a problem.

Question:
Three people - John, Joan and Kim - throw a ball to each other. There is a probability of $\frac{1}{3}$ that John will throw the ball to Joan.
There is a probability of $\frac {1}{2}$ that Joan will throw the ball to Kim.
There is a probability of $\frac {1}{4}$ that Kim will throw the ball to John.
a) Express this Markov chain as a transition matrix.

Solution:
Basically I am confused about where to find the other probabilities. This is my transition matrix so far (rows are the thrower, columns are the receiver):

            Joan    Kim    John
    John    1/3
    Joan            1/2
    Kim                    1/4

(hopefully that turns out okay; I don't know how to typeset it in LaTeX)
I know that the entries in each row must sum to 1, so if I could find another probability in each row, I could finish the matrix. However, I do not know how to find the other probabilities.
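A sketch of one way to close the gap, assuming (and this is an assumption the problem statement does not spell out) that nobody throws the ball to themselves: then the diagonal entries are 0, and the single missing entry in each row is just 1 minus the given probability. The NumPy check below uses the row order John, Joan, Kim:

```python
import numpy as np

# Transition matrix: rows = thrower, columns = receiver, order John, Joan, Kim.
# Given: P(John -> Joan) = 1/3, P(Joan -> Kim) = 1/2, P(Kim -> John) = 1/4.
# ASSUMPTION: no one throws to themselves, so each diagonal entry is 0
# and the remaining entry in each row is 1 minus the given probability.
P = np.array([
    [0.0, 1/3, 2/3],   # John:  keeps 0, to Joan 1/3, to Kim 2/3
    [1/2, 0.0, 1/2],   # Joan:  to John 1/2, keeps 0, to Kim 1/2
    [1/4, 3/4, 0.0],   # Kim:   to John 1/4, to Joan 3/4, keeps 0
])

# A valid transition matrix has every row summing to 1.
print(P.sum(axis=1))  # [1. 1. 1.]
```

If the no-self-throws assumption is wrong (e.g. a player can hold the ball), the missing mass would be distributed differently, so it is worth confirming with the course material.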
