## Markov Chain Question

1. Am I right in thinking that it is impossible for a Markov chain to have both an infinite number of transient states and an infinite number of positive-recurrent states?

where a positive-recurrent state is a recurrent state whose expected return time (the expected number of transitions to return) is finite.

If anyone could let me know whether I am correct that this is impossible, I would be very grateful.

2. False:

Start at n=0.

If n is even, let P(n->(n+1)) = 0.5 and P(n->(n+2)) = 0.5.

If n is odd, let P(n->n) = 1.

Then the odd numbers are trivially recurrent and the even numbers trivially transient.
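If it helps to see this concretely, here is a quick simulation sketch of the chain just described (plain Python; the function name `run_chain` and the step cap are my own choices, not from the thread):

```python
import random

def run_chain(start=0, max_steps=10_000):
    """Simulate the chain: from an even n, move to n+1 or n+2
    with probability 0.5 each; an odd n is absorbing."""
    n = start
    for _ in range(max_steps):
        if n % 2 == 1:          # odd states are absorbing
            return n
        n += random.choice((1, 2))
    return n

random.seed(0)
finals = [run_chain() for _ in range(1000)]
# Every run gets absorbed in some odd state, and no even state is
# ever revisited: the evens are transient, the odds recurrent.
print(all(f % 2 == 1 for f in finals))
```

Since an even state jumps to an odd (absorbing) state with probability 0.5 at each step, absorption happens after a geometric number of even steps, so every simulated run terminates quickly.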

3. Thanks for that. What does the notation -> mean?

And are the recurrent states here positive-recurrent?

Thanks...

4. P(a->b) = the probability that the Markov chain goes from state a to state b.

The expected number of steps for an odd state to return to itself is one (it returns immediately), which is finite, and so the odd numbers are positive recurrent.

According to my understanding of the definitions, this is a counterexample.

5. Thanks a lot for that, really appreciate it.
I'm just stuck on one last small part of a question asking for an example of a Markov chain with 5 states and more than one stationary distribution... any ideas?

6. There are more trivial examples, but here is one from a question paper I did:

Continuous-time Markov chain with Q matrix:

-3, 2, 0, 0, 1
0, -3, 3, 0, 0
0, 5, -5, 0, 0
0, 0, 0, -2, 2
0, 0, 0, 1, -1
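As a sanity check, here is a small sketch (plain Python, no libraries) verifying that this Q matrix admits two distinct stationary distributions, one on each of the closed classes {2,3} and {4,5}; the distributions below were solved by hand from pi*Q = 0 and are my own working, not from the thread:

```python
# Q matrix from the post (rows/columns are states 1..5).
Q = [
    [-3,  2,  0,  0,  1],
    [ 0, -3,  3,  0,  0],
    [ 0,  5, -5,  0,  0],
    [ 0,  0,  0, -2,  2],
    [ 0,  0,  0,  1, -1],
]

def pi_Q(pi, Q):
    """Row vector times matrix: (pi * Q)_j = sum_i pi_i * Q_ij."""
    return [sum(pi[i] * Q[i][j] for i in range(5)) for j in range(5)]

# Two distinct stationary distributions, each supported on one closed class.
pi1 = [0, 5/8, 3/8, 0, 0]   # supported on {2, 3}
pi2 = [0, 0, 0, 1/3, 2/3]   # supported on {4, 5}

for pi in (pi1, pi2):
    # Stationarity for a CTMC: pi * Q = 0.
    assert all(abs(x) < 1e-12 for x in pi_Q(pi, Q))
print("both are stationary")
```

Any convex combination of pi1 and pi2 is also stationary, so there are in fact infinitely many stationary distributions.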

7. Thanks. The work I'm doing is all about discrete-time Markov chains, and I have to represent the chain either with a transition matrix or a diagram... any thoughts?

8. Matrix:
0 0.5 0 0.5 0
0 0.5 0.5 0 0
0 0.5 0.5 0 0
0 0 0 0.5 0.5
0 0 0 0.5 0.5

should do it... it's basically a similar set-up to the one I posted earlier.
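A quick check that this matrix really has more than one stationary distribution (a sketch in plain Python; the two distributions, uniform on the closed classes {2,3} and {4,5}, are my own working):

```python
# Transition matrix from the post (rows/columns are states 1..5).
P = [
    [0, 0.5, 0,   0.5, 0  ],
    [0, 0.5, 0.5, 0,   0  ],
    [0, 0.5, 0.5, 0,   0  ],
    [0, 0,   0,   0.5, 0.5],
    [0, 0,   0,   0.5, 0.5],
]

def pi_P(pi, P):
    """Row vector times matrix: (pi * P)_j = sum_i pi_i * P_ij."""
    return [sum(pi[i] * P[i][j] for i in range(5)) for j in range(5)]

pi1 = [0, 0.5, 0.5, 0, 0]   # uniform on closed class {2, 3}
pi2 = [0, 0, 0, 0.5, 0.5]   # uniform on closed class {4, 5}

for pi in (pi1, pi2):
    # Stationarity for a DTMC: pi * P = pi.
    assert pi_P(pi, P) == pi
print("both are stationary")
```

State 1 is transient (it feeds into the two closed classes), and each closed class carries its own stationary distribution, which is what gives the chain more than one.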