1. Probability

Sasha and Pedro meet every Tuesday for a game of backgammon. They find that after winning a game, Sasha has a 65% probability of winning the next game. Similarly, Pedro has a 60% probability of winning after he has won a game. Pedro won the game last week.
QUESTION: a) If Pedro and Sasha play 100 games, how many games is each player likely to win?

2. Originally Posted by ives
Sasha and Pedro meet every Tuesday for a game of backgammon. They find that after winning a game, Sasha has a 65% probability of winning the next game. Similarly, Pedro has a 60% probability of winning after he has won a game. Pedro won the game last week.
QUESTION: a) If Pedro and Sasha play 100 games, how many games is each player likely to win?
Are you familiar with Markov chains, transition matrices, initial state etc.?

I have no idea about the terms you mentioned, sorry.
Thanks for the help.

4. Originally Posted by ives
I have no idea about the terms you mentioned, sorry.
Thanks for the help.
Your question cannot be answered simply by using distributions such as the Binomial or Poisson. You need to know about 2 x 2 Markov chains, so that you can analyse the state of the chain after each of the 100 transitions.

A Markov chain describes a sequence of random states in which the probability distribution of the next state depends only on the current state; here the state is simply who won the previous game.
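The chain for this problem is small enough to iterate directly. Here is a minimal pure-Python sketch (no libraries assumed) that builds the 2 x 2 transition matrix from the stated probabilities, starts from Pedro's win last week, and accumulates Sasha's expected number of wins over 100 games:

```python
# 2 x 2 Markov chain for the backgammon problem.
# State 0 = Sasha won the last game, state 1 = Pedro won the last game.
# From the problem statement:
#   P(Sasha wins next | Sasha won) = 0.65, P(Pedro wins next | Pedro won) = 0.60.

T = [[0.65, 0.35],   # from "Sasha won": Sasha wins again 0.65, Pedro 0.35
     [0.40, 0.60]]   # from "Pedro won": Sasha 0.40, Pedro wins again 0.60

state = [0.0, 1.0]   # Pedro won last week, so we start in state 1

expected_sasha = 0.0
for game in range(100):
    # Probability Sasha wins this game, given the current state distribution
    p_sasha = state[0] * T[0][0] + state[1] * T[1][0]
    expected_sasha += p_sasha
    # Advance the chain: the state distribution after this game
    state = [state[0] * T[0][0] + state[1] * T[1][0],
             state[0] * T[0][1] + state[1] * T[1][1]]

print(f"Expected Sasha wins: {expected_sasha:.1f}")
print(f"Expected Pedro wins: {100 - expected_sasha:.1f}")
```

The initial condition barely matters here: the chain forgets its starting state within a few games, so the totals land close to the long-run split of about 53 games for Sasha and 47 for Pedro.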

Haha, my exam has already passed anyway, so no worries.
Thanks a lot for your help.

6. Originally Posted by ives
Sasha and Pedro meet every Tuesday for a game of backgammon. They find that after winning a game, Sasha has a 65% probability of winning the next game. Similarly, Pedro has a 60% probability of winning after he has won a game. Pedro won the game last week.
QUESTION: a) If Pedro and Sasha play 100 games, how many games is each player likely to win?
While this is an example of a Markov chain, you don't need to know about Markov chains to solve it.

Suppose that in the long run $\displaystyle S$ wins a proportion $\displaystyle s$ of the games and $\displaystyle P$ wins a proportion $\displaystyle p$ of the games, then $\displaystyle s+p=1$.

Now suppose a game is played. The probability that $\displaystyle S$ won is $\displaystyle s$ and that $\displaystyle P$ won is $\displaystyle (1-s)$. Sasha wins after her own win with probability $\displaystyle 0.65$, and after a Pedro win with probability $\displaystyle 1-0.60=0.40$, so the probability that $\displaystyle S$ wins the following game is:

$\displaystyle s_1=s \times 0.65 + (1-s) \times 0.40$

But if we have a limiting probability then $\displaystyle s_1=s$, so we have:

$\displaystyle s=s \times 0.65 + (1-s) \times 0.40$

so:

$\displaystyle s=\frac{0.40}{0.75}=\frac{8}{15}\approx 0.533$

Now in a run of 100 games the distribution of wins will be affected by the initial condition (we know Pedro won last week), but the chain settles into its long-run behaviour very quickly, so to a hand-waving approximation Sasha will win about 53 of the games and Pedro about 47.
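As a sanity check, the sketch below (plain Python, standard-library `random` only; the seed choice is arbitrary) solves the fixed-point equation implied by the stated win probabilities (0.65 after a Sasha win, 0.60 after a Pedro win) and compares it with the win rate from a long simulated run of games:

```python
import random

# Fixed point of s = 0.65*s + 0.40*(1 - s), from the stated probabilities:
s = 0.40 / (1 - 0.65 + 0.40)   # = 0.40 / 0.75 = 8/15

# Monte Carlo sanity check: simulate many games and count Sasha's wins.
random.seed(1)                  # arbitrary seed, for reproducibility
sasha_won_last = False          # Pedro won the game last week
wins = 0
n_games = 200_000
for _ in range(n_games):
    p = 0.65 if sasha_won_last else 0.40
    sasha_won_last = random.random() < p
    wins += sasha_won_last      # bool counts as 0 or 1

print(f"fixed point: {s:.4f}, simulated win rate: {wins / n_games:.4f}")
```

Both numbers come out near 8/15, i.e. roughly a 53/47 split over 100 games rather than an even one.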

RonL