# Math Help - Markov Processes

1. ## Markov Processes! (Markov Chain)

Hi there...

Let ${X}$ be a Markov chain and ${T}$ be a finite stopping time. Show that ${(X_{n+T})}_{n \geq 0}$ defines a Markov chain, taking care to state the relevant filtration.

Any help would be greatly appreciated.

2. Work with the filtration $\mathcal{G}_m = \mathcal{F}_{T+m}$, where $(\mathcal{F}_n)_{n\geq 0}$ is the natural filtration of $X$. Since $T$ is finite and $\{T=i\} \in \mathcal{F}_i \subseteq \mathcal{F}_{i+m}$, we can decompose over the values of $T$ using indicators (not factors $\mathbb{P}(T=i)$, which would require $T$ to be independent of $X$): for $n \geq m$ and any measurable set $A$,

$\mathbb{P}(X_{T+n}\in A\mid \mathcal{F}_{T+m})=\sum_i \mathbf{1}_{\{T=i\}}\,\mathbb{P}(X_{i+n}\in A\mid \mathcal{F}_{i+m})$
$=\sum_i \mathbf{1}_{\{T=i\}}\,\mathbb{P}(X_{i+n}\in A\mid \sigma(X_{i+m}))=\mathbb{P}(X_{T+n}\in A\mid \sigma(X_{T+m}))$

where the middle equality is the Markov property of $X$ applied at the deterministic time $i+m$.

Add rigour to your liking.
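If you want a numerical sanity check of the claim, here is a minimal sketch (not from the thread; the two-state transition matrix `P` and the choice $T = \inf\{n : X_n = 1\}$ are illustrative assumptions): simulate many paths, shift each one by its stopping time, and verify that the empirical transitions of $(X_{T+n})_{n\geq 0}$ still match `P`.

```python
import random

random.seed(0)

# Illustrative two-state chain: P[i][j] = transition probability i -> j
P = {0: [0.7, 0.3], 1: [0.4, 0.6]}

def step(x):
    """One step of the chain from state x."""
    return 0 if random.random() < P[x][0] else 1

def run_after_T(n_paths=200_000, n_after=2):
    """Start each path at 0, stop at T = first hitting time of state 1
    (finite a.s. here), then record transitions of the shifted chain."""
    counts = {(i, j): 0 for i in P for j in P}
    for _ in range(n_paths):
        x = 0
        while x != 1:          # run until the stopping time T
            x = step(x)
        for _ in range(n_after):   # transitions of (X_{T+n})
            y = step(x)
            counts[(x, y)] += 1
            x = y
    return counts

counts = run_after_T()
for i in P:
    row = sum(counts[(i, j)] for j in P)
    print(i, [round(counts[(i, j)] / row, 3) for j in P])
```

The printed empirical rows should be close to the rows of `P`, which is exactly the strong Markov statement the computation above proves.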