Math Help - Markov Process

1. Markov Process

Hi,
I was hoping someone could help me with this question. I am having trouble understanding the generator matrix Q, so this whole question is difficult for me.

Let $X = (X_t)$ be a Markov process with state space $\{1,2,3\}$ and generator

$Q = \begin{pmatrix} -3 & 2 & 1 \\ 2 & -5 & 3 \\ 1 & 3 & -4 \end{pmatrix}$

(1) Determine the sojourn times in states 1, 2, and 3.

(2) Let $Y = \{Y_n\}$ be a discrete skeleton of $X$. Determine the stationary distribution $\nu$ of $Y$.

(3) Determine the stationary distribution $\pi$ of $X$.

Thanks in advance for any help.

2. Originally Posted by BrooketheChook

You can understand the process $X$ as a discrete Markov chain with exponential holding times between consecutive jumps. The parameters of the holding times are given by the diagonal of the matrix $Q$: $c(x) = -q_{xx} = \sum_{y\neq x} q_{xy}$. The transition probabilities are given by the proportions between the rates: $p_{xy} = \frac{q_{xy}}{\sum_{z\neq x} q_{xz}} = \frac{q_{xy}}{c(x)}$ (i.e. you normalize the rows of $Q$, ignoring the diagonal).

Thus, if $Y$ is the "discrete skeleton" of $X$ (meaning that $Y_0 = X_0$, $Y_1$ is the state of $X$ after its first jump, etc.), then $Y$ is a Markov chain with the transition matrix $P = (p_{xy})_{x,y}$ just defined (of course $p_{xx} = 0$, by the definition of $Y$). The process $X$ is then described as follows: it stays at its initial position for a time $\tau_1$ which is exponential with parameter $c(X_0)$, then it jumps to $Y_1$, where it spends a time $\tau_2$ following an exponential distribution with parameter $c(Y_1) = c(X_{\tau_1})$, and so on.
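As a quick sanity check, here is a small sketch (in Python with NumPy, hard-coding the $Q$ from the question) that extracts the rates $c$ and the jump-chain matrix $P$:

```python
import numpy as np

# The generator Q from the question
Q = np.array([[-3.0,  2.0,  1.0],
              [ 2.0, -5.0,  3.0],
              [ 1.0,  3.0, -4.0]])

c = -np.diag(Q)             # holding-time rates: c(x) = -q_xx
P = Q / c[:, None]          # divide each row by c(x) ...
np.fill_diagonal(P, 0.0)    # ... and set p_xx = 0 for the jump chain

print(c)   # [3. 5. 4.]
print(P)   # rows (0, 2/3, 1/3), (2/5, 0, 3/5), (1/4, 3/4, 0)
```

Each row of $P$ sums to 1, as it should for a transition matrix.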

It is equivalent to be given $Q$ or both $P=(p_{xy})_{x,y}$ and $(c(x))_x$.

The matrix $Q$ itself can be used to describe the process in the following way. Imagine that each "edge" $(i,j)$ between distinct states $i,j$ comes with a "clock" that ticks after an exponential time with parameter $q_{ij}$. When $X$ is at state $i$, it waits until the first of the clocks on the neighbouring edges ticks, and jumps across that edge; then all the clocks are reset to 0 to decide the next jump, and so on.

This gives the following "infinitesimal" description: for $i\neq j$, $P_i(X_h=j)=q_{ij}h + o(h)$ as $h\to 0$.
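One can check this numerically on the question's $Q$: for small $h$, the transition matrix $P(h) = e^{hQ}$ (approximated here by a truncated power series, to stay self-contained) has off-diagonal entries close to $q_{ij}\,h$:

```python
import numpy as np

Q = np.array([[-3.0,  2.0,  1.0],
              [ 2.0, -5.0,  3.0],
              [ 1.0,  3.0, -4.0]])

h = 1e-4
# P(h) = exp(hQ) ≈ I + hQ + (hQ)^2/2 for small h
Ph = np.eye(3) + h * Q + (h * Q) @ (h * Q) / 2

print(Ph[0, 1] / h)   # ≈ q_12 = 2
print(Ph[0, 0])       # ≈ 1 - c(1) h = 1 - 3h
```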

One last thing: if $\nu$ is a stationary distribution for the discrete-time Markov chain $Y$ (you know how to find that), then $\mu(x)=\frac{\nu(x)}{c(x)}$ defines a stationary measure for $X$ (which you may need to normalize in order to get a probability distribution). Notice that $\frac{1}{c(x)}$ is the expectation of the holding time at site $x$ (expected value of an exponential r.v.), so that going from $\nu$ (holding time = 1) to $\mu$ (exponential holding times) consists in giving more weight to sites where the Markov chain $X$ stays longer. That makes sense.
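Putting this together for the question's $Q$, here is a sketch that finds $\nu$ by solving $\nu P = \nu$ (as a least-squares system together with the normalization $\sum_x \nu(x) = 1$) and then applies $\mu(x) = \nu(x)/c(x)$:

```python
import numpy as np

# Jump-chain matrix P and rates c derived from the question's Q
c = np.array([3.0, 5.0, 4.0])
P = np.array([[0.0, 2/3, 1/3],
              [2/5, 0.0, 3/5],
              [1/4, 3/4, 0.0]])

# nu P = nu  <=>  (P^T - I) nu = 0; append the constraint sum(nu) = 1
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
nu, *_ = np.linalg.lstsq(A, b, rcond=None)
print(nu)    # ≈ (1/4, 5/12, 1/3)

# Stationary distribution of X: mu(x) = nu(x)/c(x), then normalize
mu = nu / c
pi = mu / mu.sum()
print(pi)    # ≈ (1/3, 1/3, 1/3): pi is uniform, since this Q is symmetric
```

The uniform answer for $\pi$ is no accident here: the given $Q$ is symmetric, so its columns sum to zero and the uniform distribution satisfies $\pi Q = 0$.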

--
So, you should:
- write down $c(i)$ for each state (this is given by the diagonal, or by the sum of the off-diagonal terms of the row); this gives you the parameters of the sojourn times at the different states.
- write down the matrix $P$ obtained by normalizing the off-diagonal entries of $Q$ (i.e. dividing them by $c$ so that the rows sum to 1) and putting zeroes on the diagonal.
- compute the stationary distribution of $Y$ from $P$ (by solving $\nu P = \nu$...).
- deduce the stationary distribution of $X$ using the above formula ($\mu(x) = \frac{\nu(x)}{c(x)}$, then normalize).