When is an Irreducible Markov Chain Transient?

May 2010
Hi everyone, I am new to this forum and would greatly appreciate some advice on a question I have.

I need to show that an irreducible Markov chain is transient if and only if, for each state i, there is a state j ≠ i such that fji < 1, where fji denotes the probability that the chain ever visits state i given that it starts in state j.
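To get a concrete feel for the quantity fji, here is a small Monte Carlo sketch (function names and parameters are my own, not from any textbook). It uses the standard example of a biased random walk on the integers, which is irreducible and transient when the walk drifts to the right; the classical gambler's-ruin formula gives f10 = 0.3/0.7 ≈ 0.43 < 1 for a walk that steps right with probability 0.7, and the simulation should land near that value.

```python
import random

def estimate_hit_prob(p_right, start, target, trials=20000, max_steps=10000, seed=0):
    """Monte Carlo estimate of f_{start, target}: the probability that a
    biased random walk on the integers (step +1 with probability p_right,
    step -1 otherwise) ever visits `target` when started at `start`.
    Runs are truncated at max_steps, so the estimate is a slight
    underestimate of the true hitting probability."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        state = start
        for _ in range(max_steps):
            if state == target:
                hits += 1
                break
            state += 1 if rng.random() < p_right else -1
    return hits / trials

# With p_right = 0.7 the walk is transient (it drifts to +infinity),
# and gambler's ruin gives f_{1,0} = 0.3/0.7, about 0.43 -- strictly below 1,
# exactly as the statement of the theorem requires.
est = estimate_hit_prob(0.7, start=1, target=0)
print(est)
```

This is only an illustration of the definition, of course, not a proof; the actual exercise is about the general equivalence, not a particular chain.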

I'd appreciate any help or advice you can give on this.