Discrete Time Markov Chain

Hey Guys,

I have a quick question and was hoping you could help me understand how to approach this problem...

Let {Xt}t≥0 be a two-state Markov chain with state space S = {0, 1} and transition matrix:

and initial distribution

Define the new stochastic processes {Yt}t≥1 and {Zt}t≥1 as:

and,

**a) What are the state spaces of these new stochastic processes?**

I have tried substituting the values of S = {0, 1} into the definitions of {Yt} and {Zt}, but I'm not sure if I'm on the right path. Any suggestions?
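To show where I'm at, here's a small Python sketch of what I mean by "inputting the values of S". Since I can't paste the actual formulas, I've assumed purely for illustration that Yt = X_{t-1} + X_t and Zt = X_{t-1} · X_t; the idea is that the state space of a derived process is just the set of values its defining function can take as the underlying chain ranges over S:

```python
from itertools import product

# NOTE: these definitions of Yt and Zt are hypothetical stand-ins for the
# formulas in the question, used only to illustrate the approach.
S = {0, 1}

# Enumerate every value the defining function can take as
# (X_{t-1}, X_t) ranges over S x S.
Y_states = {x_prev + x_curr for x_prev, x_curr in product(S, S)}  # {0, 1, 2}
Z_states = {x_prev * x_curr for x_prev, x_curr in product(S, S)}  # {0, 1}

print(Y_states)
print(Z_states)
```

Is this the right idea, i.e. the state space comes from enumerating the possible outputs of the defining function, independent of the transition matrix?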

Thanks