Sasha and Pedro meet every Tuesday for a game of backgammon. They find that after winning a game, Sasha has a 65% probability of winning the next game. Similarly, Pedro has a 60% probability of winning a game after he has won the previous one. Pedro won the game last week.
QUESTION: a) If Pedro and Sasha play 100 games, how many games is each player likely to win?
This is a two-state Markov chain: the probability of each game's outcome depends only on who won the previous game.
Suppose that in the long run Sasha wins a proportion s of the games and Pedro wins a proportion p of the games; then s + p = 1.

Now suppose a game is played. The probability that Sasha won is s and that Pedro won is p, so the probability that Sasha wins the following game is:

0.65s + 0.40p

But if s is a limiting probability, this must again equal s, so we have:

s = 0.65s + 0.40(1 − s), which gives 0.75s = 0.40, so s = 8/15 ≈ 0.533 and p = 7/15 ≈ 0.467.
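As a sanity check on the algebra, here is a minimal sketch that finds the same limiting proportions numerically by solving πP = π for the two-state transition matrix (the matrix entries come straight from the stated 65% and 60% figures):

```python
import numpy as np

# Transition matrix: state 0 = "Sasha won last game", state 1 = "Pedro won".
# Row i gives the probabilities for the winner of the next game.
P = np.array([[0.65, 0.35],   # after a Sasha win
              [0.40, 0.60]])  # after a Pedro win

# Stationary distribution: solve pi @ P = pi together with pi.sum() == 1,
# set up as an overdetermined least-squares system.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # ≈ [0.5333, 0.4667], i.e. s = 8/15 and p = 7/15
```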
Now in a run of 100 games the distribution of wins will be affected somewhat by the initial condition (Pedro won last week), but the chain converges to its limiting distribution very quickly, so to a good approximation each player wins about half the games; more precisely, Sasha should win about 100 × 8/15 ≈ 53 games and Pedro about 47.
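To quantify how little the initial condition matters, we can compute the expected win counts exactly by iterating the one-step recurrence over 100 games, starting from the known state (Pedro won last week, so Sasha's chance in game 1 is 0.40):

```python
# Expected wins over 100 games, iterating the chain exactly rather than
# assuming stationarity. x tracks P(Sasha wins game k).
x = 0.40               # Pedro won last week, so Sasha wins game 1 w.p. 0.40
expected_sasha = 0.0
for _ in range(100):
    expected_sasha += x
    x = 0.65 * x + 0.40 * (1 - x)   # P(Sasha wins the following game)
expected_pedro = 100 - expected_sasha
print(round(expected_sasha, 1), round(expected_pedro, 1))  # ≈ 53.2 46.8
```

The exact expectation (≈53.2 for Sasha) differs from the stationary estimate of 53.3 by less than a fifth of a game, confirming that the starting condition is a negligible correction here.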