Sasha and Pedro meet every Tuesday for a game of backgammon. They find that after winning a game, Sasha has a 65% probability of winning the next game. Similarly, Pedro has a 60% probability of winning after he has won a game. Pedro won the game last week.
QUESTION: a) If Pedro and Sasha play 100 games, how many games is each player likely to win?
This question cannot be answered simply by using distributions such as the binomial or Poisson. You need some knowledge of a 2 x 2 Markov chain, so that you can analyse the state after the 100th transition.
In a Markov chain, the probability distribution of the next state depends on which state the process is currently in.
While this is an example of a Markov chain, you don't need to know about Markov chains to solve it.
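To make the 2 x 2 chain concrete, here is a minimal sketch (Python with numpy; the variable names are mine) that propagates the win distribution game by game, starting from Pedro's win last week, and accumulates each player's expected number of wins over 100 games:

```python
import numpy as np

# Transition matrix over "who won the current game": state 0 = Sasha, 1 = Pedro.
# Row i gives the distribution of who wins the next game.
T = np.array([[0.65, 0.35],   # after a Sasha win: Sasha 65%, Pedro 35%
              [0.40, 0.60]])  # after a Pedro win: Sasha 40%, Pedro 60%

state = np.array([0.0, 1.0])  # Pedro won last week
expected = np.zeros(2)
for _ in range(100):          # play 100 games, accumulating expected wins
    state = state @ T
    expected += state

print(expected)  # expected wins for [Sasha, Pedro]: about 53 and 47
```

The totals come out near 53 for Sasha and 47 for Pedro: the chain forgets the starting state quickly, since the second eigenvalue of T is 0.65 + 0.60 - 1 = 0.25.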
Suppose that in the long run Sasha wins a proportion p_S of the games and Pedro wins a proportion p_P of the games; then p_S + p_P = 1.

Now suppose a game is played. The probability that Sasha won it is p_S and that Pedro won it is p_P, so the probability that Sasha wins the following game is:

0.65 p_S + 0.40 p_P

But if p_S is a limiting probability, this must again equal p_S, so we have:

p_S = 0.65 p_S + 0.40 (1 - p_S)

so:

0.75 p_S = 0.40, giving p_S = 8/15 ≈ 0.533 and hence p_P = 7/15 ≈ 0.467.
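The long-run proportion solves the fixed-point equation p_S = 0.65 p_S + 0.40 (1 - p_S); a quick exact-arithmetic check of that solution in Python (a sketch; the variable names are mine):

```python
from fractions import Fraction

# Rearranging p_S = 0.65*p_S + 0.40*(1 - p_S):
# 0.35*p_S = 0.40 - 0.40*p_S  ->  0.75*p_S = 0.40  ->  p_S = 40/75 = 8/15
p_S = Fraction(40, 100) / Fraction(75, 100)
p_P = 1 - p_S
print(p_S, p_P)  # 8/15 7/15
```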
Now in a run of 100 games the distribution of wins will be affected by the initial conditions, and we know nothing of the win probabilities in the absence of a previous result, but to a hand-waving approximation we can say that Sasha will have won about 53 games and Pedro about 47, i.e. each player wins roughly half of the games.
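As a sanity check on that approximation, here is a small Monte Carlo simulation (a sketch; the names and seed are mine) of many 100-game runs, each starting from Pedro's win last week:

```python
import random

random.seed(1)
p_repeat = {"S": 0.65, "P": 0.60}  # probability the last winner wins again
wins = {"S": 0, "P": 0}
trials = 10_000
for _ in range(trials):
    last = "P"                      # Pedro won last week
    for _ in range(100):
        if random.random() < p_repeat[last]:
            winner = last           # the previous winner repeats
        else:
            winner = "S" if last == "P" else "P"
        wins[winner] += 1
        last = winner
avg = {k: v / trials for k, v in wins.items()}
print(avg)  # average wins per 100-game run, roughly {'S': 53.2, 'P': 46.8}
```

The simulated averages sit close to the long-run proportions 8/15 and 7/15 of 100 games, confirming the hand-waving estimate.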
RonL