1. Markov chain transition matrix

In a certain production process, items pass through two manufacturing stages. At the end of each stage, an item is either scrapped with probability 0.15, sent through the stage again for rework with probability 0.25, or passed on to the next stage with probability 0.6. Describe this as a Markov chain and set up the transition matrix. What is the expected number of steps to absorption?

2. Hello,

Introduce a third state: the one the product goes to if it is scrapped. Once there, it cannot leave: $\mathbb{P}(\textcircled{3}\to \textcircled{3})=1$

And from your text, you have:
$\mathbb{P}(\textcircled{1}\to \textcircled{1})=0.25$
$\mathbb{P}(\textcircled{1}\to \textcircled{2})=0.6$
$\mathbb{P}(\textcircled{1}\to \textcircled{3})=0.15$

But I don't understand what happens for the second stage...?

This gives the following transition matrix:

$\begin{pmatrix} 0.25 & 0.6 & 0.15 \\ ?&?&? \\ 0&0&1 \end{pmatrix}$

3. Originally Posted by Moo
But I don't understand what happens for the second stage...?
Probably a fourth state $\textcircled{4}$ should be introduced for the finished items: after $\textcircled{2}$, the item either is scrapped (goes to $\textcircled{3}$), is reworked (stays at $\textcircled{2}$) or leaves the line (goes to $\textcircled{4}$).
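With that four-state chain ($\textcircled{3}$ and $\textcircled{4}$ both absorbing), the expected number of steps to absorption follows from the fundamental matrix $N=(I-Q)^{-1}$, where $Q$ is the transient-to-transient block. Here is a small numerical sketch of that calculation (the 4-state ordering and the use of numpy are my own choices, not from the original question):

```python
import numpy as np

# States ordered (1, 2, 3, 4):
#   1 = stage 1, 2 = stage 2 (transient),
#   3 = scrapped, 4 = finished (absorbing).
P = np.array([
    [0.25, 0.60, 0.15, 0.00],
    [0.00, 0.25, 0.15, 0.60],
    [0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 1.00],
])

# Q = transient-to-transient block (states 1 and 2).
Q = P[:2, :2]

# Fundamental matrix N = (I - Q)^(-1); its row sums give the
# expected number of steps until absorption from each transient state.
N = np.linalg.inv(np.eye(2) - Q)
t = N @ np.ones(2)
print(t)  # expected steps starting from stage 1 and from stage 2
```

Starting from stage 1 this gives 2.4 expected steps (and 4/3 starting from stage 2), since each stage is left after a geometric number of attempts with mean $1/0.75 = 4/3$, and stage 1 reaches stage 2 with probability 0.8.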

In a certain production process, items are pass through two manufacturing stages. At the end of each stage, the items are either scrapped with a probability of 0.15, sent through the stage again for rework with a probability of 0.25, or passed on the the next step with a probability of 0.6. Describe this as a Markov chain and set up the transition matrix? What is the expected number of steps to absorption?