## Markov chain of random variables from a primitive Markov chain

I'm a little unsure if my approach here is correct, which is making it difficult to find the limit. Any and all help will be appreciated!

Let $\{X_n\}$ be a primitive finite-state Markov chain with transition matrix $P$ and stationary distribution $w$. Define the process $\{Y_n\}$ by:
$Y_n = (X_{n-1}, X_n)$

Show that $\{Y_n\}$ is a Markov chain, and compute:
$\lim_{n\to\infty} P(Y_n = (i,j))$

What I've done:
So I need to show that $Y_{n+1}$ depends on the past only through $Y_n$, i.e.:
$P(Y_{n+1} \mid Y_n, Y_{n-1}, \dots, Y_0) = P(Y_{n+1} \mid Y_n)$

So, suppose $Y_n = (a,b)$, where $a$ and $b$ are states of the original chain. Then:
$P(Y_{n+1} = (i,j) | Y_{n} = (a,b))$
Plugging in the definition of $\{Y_n\}$:
$= P(X_n = i, X_{n+1} = j \mid X_{n-1} = a, X_n = b)$
Because $\{X_n\}$ is a primitive Markov chain, can I say that this
$= P(X_n = i, X_{n+1} = j)$
because $X_{n+1}$ doesn't depend on $X_{n-1}$? This is basically where I'm stuck.
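Separately, to at least guess what the limit should be, I tried a small numerical sketch (the $3\times 3$ transition matrix below is just made-up example numbers, and the 200 iterations are an arbitrary burn-in, not part of the problem). Using the chain rule, $P(Y_n = (i,j)) = P(X_{n-1} = i, X_n = j) = P(X_{n-1} = i)\,P_{ij}$, so I iterated the marginal of $X_{n-1}$ and formed the pair distribution. Numerically it seems to converge to $w_i P_{ij}$, though I can't prove it yet:

```python
import numpy as np

# Made-up primitive 3-state transition matrix (rows sum to 1)
P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.1, 0.5],
              [0.3, 0.3, 0.4]])

# Marginal distribution of X_{n-1} after many steps, starting in state 0
mu = np.array([1.0, 0.0, 0.0])
for _ in range(200):
    mu = mu @ P

# P(Y_n = (i,j)) = P(X_{n-1} = i) * P_{ij}, by the chain rule
pair_dist = mu[:, None] * P

# Approximate the stationary distribution w: for a primitive chain,
# every row of P^n converges to w
w = np.linalg.matrix_power(P, 200)[0]

# Does the pair distribution look like w_i * P_{ij}?
print(np.allclose(pair_dist, w[:, None] * P))  # True
```

The same `True` comes out for other starting distributions and other primitive matrices I tried, which is what makes me suspect the limit is $w_i P_{ij}$, but I'd still like to see how to justify the conditioning step above rigorously.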