
Thread: Markov Process

  1. #1
    Junior Member BrooketheChook
    Joined
    Sep 2009
    From
    Gold Coast
    Posts
    27

    Markov Process

    Hi,
    I was hoping someone could help me with this question. I am having trouble understanding the generator matrix Q, so this whole question is difficult for me.

    Let $\displaystyle X = (X_t)$ be a Markov process with state space $\displaystyle \{1,2,3\}$ and generator

    $\displaystyle Q = \begin{pmatrix} -3 & 2 & 1 \\ 2 & -5 & 3 \\ 1 & 3 & -4 \end{pmatrix}$

    (1) Determine the sojourn times in states 1, 2 and 3.

    (2) Let $\displaystyle Y = \{Y_n\}$ be a discrete skeleton of $\displaystyle X$. Determine the stationary distribution $\displaystyle \nu$ of $\displaystyle Y$.

    (3) Determine the stationary distribution $\displaystyle \pi$ of $\displaystyle X$.

    Thanks in advance for any help.

  2. #2
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by BrooketheChook
    Here's a short vade mecum about these generator matrices.

    You can understand the process $\displaystyle X$ as a discrete Markov chain with exponential holding times between consecutive jumps. The parameters of the holding times are given by the diagonal of the matrix $\displaystyle Q$: $\displaystyle c(x)=-q_{xx} = \sum_{y\neq x} q_{xy}$. And the transitions are given by the proportions between the jump rates: $\displaystyle p_{xy}=\frac{q_{xy}}{\sum_{z\neq x} q_{xz}}$ (i.e. you normalize the rows of the matrix $\displaystyle Q$, leaving out the diagonal).

    Thus, if $\displaystyle Y$ is the "discrete skeleton" of $\displaystyle X$ (meaning that $\displaystyle Y_0=X_0$, $\displaystyle Y_1$ is the state of $\displaystyle X$ after its first jump, etc.), then $\displaystyle Y$ is a Markov chain with the transition matrix $\displaystyle P$ given above (of course, $\displaystyle p_{xx}=0$ by the definition of $\displaystyle Y$), and the process $\displaystyle X$ is described as follows: it stays at its initial position for a time $\displaystyle \tau_1$, which is exponential with parameter $\displaystyle c(X_0)$; then it jumps to $\displaystyle Y_1$, where it spends a time $\displaystyle \tau_2$ following an exponential distribution with parameter $\displaystyle c(Y_1)=c(X_{\tau_1})$; and so on.

    It is equivalent to be given $\displaystyle Q$ or both $\displaystyle P=(p_{xy})_{x,y}$ and $\displaystyle (c(x))_x$.
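    As a concrete illustration (my own numpy sketch, not from the original post), here is how to extract $\displaystyle c$ and $\displaystyle P$ from the $\displaystyle Q$ given in the question:

```python
import numpy as np

# Generator matrix from the question
Q = np.array([[-3.,  2.,  1.],
              [ 2., -5.,  3.],
              [ 1.,  3., -4.]])

# Holding-time parameters: c(x) = -q_xx
c = -np.diag(Q)                # -> [3., 5., 4.]

# Jump-chain transition matrix: divide each row by c(x), zero the diagonal
P = Q / c[:, None]
np.fill_diagonal(P, 0.0)       # off-diagonal rows now sum to 1
```

    So the sojourn time in state 1 is exponential with parameter 3 (and similarly 5 and 4 for states 2 and 3), and $\displaystyle P$ has rows $\displaystyle (0, 2/3, 1/3)$, $\displaystyle (2/5, 0, 3/5)$, $\displaystyle (1/4, 3/4, 0)$.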

    The matrix $\displaystyle Q$ itself can be used to describe the process, in the following way. Imagine that each "edge" $\displaystyle (i,j)$ between different states $\displaystyle i,j$ comes with a "clock" that ticks after an exponential time of parameter $\displaystyle q_{ij}$. When $\displaystyle X$ is at site $\displaystyle i$, it waits until the first of the clocks on the neighbouring edges ticks, and jumps across that edge; then all the clocks are reset to 0 to decide the next jump, and so on.
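    This clock race can be simulated directly. The sketch below (numpy, my own illustration, not part of the thread) puts an exponential clock of rate $\displaystyle q_{ij}$ on each edge out of state $\displaystyle i$ and records which clock ticks first:

```python
import numpy as np

rng = np.random.default_rng(0)

Q = np.array([[-3.,  2.,  1.],
              [ 2., -5.,  3.],
              [ 1.,  3., -4.]])

def one_jump(Q, i, rng):
    """One step of the clock race from state i (0-indexed)."""
    states = [j for j in range(len(Q)) if j != i]
    rates = np.array([Q[i, j] for j in states])
    ticks = rng.exponential(1.0 / rates)   # one exponential clock per edge
    k = ticks.argmin()                     # the first clock to tick wins
    return states[k], ticks[k]             # next state, sojourn time

# From state 1 (index 0): jumps to state 2 with prob. 2/3, mean sojourn 1/3
samples = [one_jump(Q, 0, rng) for _ in range(20000)]
frac_to_2 = np.mean([s == 1 for s, t in samples])
mean_hold = np.mean([t for s, t in samples])
```

    This works because the minimum of independent exponentials with rates $\displaystyle q_{ij}$ is exponential with rate $\displaystyle c(i)=\sum_{j\neq i} q_{ij}$, and edge $\displaystyle (i,j)$ wins the race with probability $\displaystyle q_{ij}/c(i)$, which is exactly the description above.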

    This gives the following "infinitesimal" description: for $\displaystyle i\neq j$, $\displaystyle P_i(X_h=j)=q_{ij}h + o(h)$ as $\displaystyle h\to 0$.
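    This infinitesimal description can be checked numerically: $\displaystyle P_i(X_h=j)$ is the $\displaystyle (i,j)$ entry of $\displaystyle e^{hQ}$. A small sketch (the truncated series below is my own stand-in for a library matrix exponential; it is accurate here because $\displaystyle \|hQ\|$ is tiny):

```python
import numpy as np

Q = np.array([[-3.,  2.,  1.],
              [ 2., -5.,  3.],
              [ 1.,  3., -4.]])

def expm_series(A, terms=30):
    """Truncated power series for the matrix exponential e^A."""
    out, term = np.eye(len(A)), np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out

h = 1e-4
Ph = expm_series(h * Q)    # transition probabilities over a short time h
# Off-diagonal entries satisfy P_i(X_h = j) = q_ij * h + o(h):
ratio = Ph[0, 1] / h       # close to q_{12} = 2 (0-indexed as Q[0, 1])
```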

    One last thing: if $\displaystyle \nu$ is a stationary distribution for the discrete-time Markov chain $\displaystyle Y$ (you know how to find that), then $\displaystyle \mu(x)=\frac{\nu(x)}{c(x)}$ defines a stationary measure for $\displaystyle X$ (which you may need to normalize in order to get a probability distribution). Notice that $\displaystyle \frac{1}{c(x)}$ is the expectation of the holding time at site $\displaystyle x$ (expected value of an exponential r.v.), so that going from $\displaystyle \nu$ (unit holding times) to $\displaystyle \mu$ (exponential holding times) amounts to giving more weight to sites where the process $\displaystyle X$ stays longer. That makes sense.

    --
    So, you should:
    - write down what $\displaystyle c(i)$ is for each site (this is given by the diagonal, or by the sum of the off-diagonal terms); this gives you the parameters of the sojourn times at different sites.
    - write down the matrix $\displaystyle P$ obtained by normalizing the off-diagonal entries of $\displaystyle Q$ (i.e. dividing them by $\displaystyle c$ so that the rows sum to 1) and putting zeroes on the diagonal
    - compute the stationary distribution of $\displaystyle Y$ from $\displaystyle P$ (solving $\displaystyle \nu P=\nu$...)
    - deduce the stationary distribution of $\displaystyle X$ using the above formula ($\displaystyle \mu(x)=\frac{\nu(x)}{c(x)}$ and normalization)
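    The four steps above can be put together numerically, as a check against a hand computation (my own numpy sketch; the left-eigenvector route is one standard way to solve $\displaystyle \nu P=\nu$):

```python
import numpy as np

Q = np.array([[-3.,  2.,  1.],
              [ 2., -5.,  3.],
              [ 1.,  3., -4.]])

# Step 1: sojourn-time parameters c(i) = -q_ii
c = -np.diag(Q)

# Step 2: jump-chain matrix P (normalized off-diagonal rows, zero diagonal)
P = Q / c[:, None]
np.fill_diagonal(P, 0.0)

# Step 3: stationary distribution of Y: left eigenvector of P for eigenvalue 1
w, v = np.linalg.eig(P.T)
nu = np.real(v[:, np.isclose(w, 1)].ravel())
nu = nu / nu.sum()

# Step 4: pi(x) proportional to nu(x) / c(x), then normalize
pi = (nu / c) / (nu / c).sum()

# Sanity check: pi is stationary for X iff pi Q = 0
assert np.allclose(pi @ Q, 0)
```

    For this particular $\displaystyle Q$ (which happens to be symmetric), $\displaystyle \pi$ comes out uniform, and hence $\displaystyle \nu\propto c$, i.e. $\displaystyle \nu=(1/4,\,5/12,\,1/3)$.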

