- Sep 5th 2009, 04:42 PM, Jimmy_W: Prove Bernoulli Process is Markov Chain
Let $X_1, X_2, \ldots$ be a Bernoulli process with success parameter $p$. Prove that the two processes defined from it are Markov chains. (The defining formulas of the two processes were posted as rendered images and are missing from this archived copy.)

- Sep 8th 2009, 06:40 PM, Jimmy_W
Does anyone have any idea how to go about these?
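The images with the process definitions are gone, but a very common version of this exercise takes the first process to be the success-count (partial-sum) process. Assuming that form, which is an assumption and not recoverable from the post, the Markov property follows in one line from the independence of the $X_i$:

```latex
% Assumption: the first process is the success-count process
% S_n = X_1 + \cdots + X_n, with S_0 = 0.
% Because X_{n+1} is independent of (X_1, \ldots, X_n), conditioning on the
% whole history collapses to conditioning on the current value alone:
\begin{align*}
P(S_{n+1} = j \mid S_n = i, S_{n-1} = i_{n-1}, \ldots, S_1 = i_1)
  &= P(X_{n+1} = j - i) \\
  &= \begin{cases}
       p     & j = i + 1, \\
       1 - p & j = i, \\
       0     & \text{otherwise},
     \end{cases}
\end{align*}
% This depends only on the current state i, so (S_n) is a Markov chain.
```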

- Sep 9th 2009, 01:58 AM, pedrosorio

The formulas in this reply were also posted as images and are missing; the surviving text suggests the following argument. Suppose the process takes a particular value at step $n$: two different underlying outcomes of the $X_i$ could have produced it, each with probability 0.5. Knowing only this current value fixes one conditional distribution for the next value. However, if we also know the value at step $n-1$, one of the two underlying outcomes is ruled out, and the conditional distribution of the next value changes.

Since the transition probabilities depend on more than the current state, it doesn't seem to be a Markov chain.
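pedrosorio's equations are lost, but his case analysis (two equally likely underlying outcomes, one of which is eliminated by knowing the previous value) matches the textbook non-Markov process $Z_n = X_n + X_{n+1}$ built from a Bernoulli process. Purely as an illustration under that assumption, a quick simulation with $p = 0.5$ shows the extra history changing the transition probability:

```python
import random


def simulate_z(num_steps, p=0.5, seed=0):
    """Simulate the hypothetical process Z_n = X_n + X_{n+1},
    where the X_i are i.i.d. Bernoulli(p)."""
    rng = random.Random(seed)
    x = [1 if rng.random() < p else 0 for _ in range(num_steps + 2)]
    return [x[n] + x[n + 1] for n in range(num_steps + 1)]


def cond_prob(z, history, nxt):
    """Empirical P(Z_{n+1} = nxt | last len(history) values == history)."""
    k = len(history)
    hits = total = 0
    for n in range(k - 1, len(z) - 1):
        if tuple(z[n - k + 1 : n + 1]) == history:
            total += 1
            hits += z[n + 1] == nxt
    return hits / total


z = simulate_z(200_000)
p1 = cond_prob(z, (1,), 2)     # conditioning on Z_n = 1 alone: about 1/4
p2 = cond_prob(z, (0, 1), 2)   # also conditioning on Z_{n-1} = 0: about 1/2
print(p1, p2)
```

Here $Z_n = 1$ leaves two equally likely cases, $(X_n, X_{n+1}) = (1,0)$ or $(0,1)$, so $P(Z_{n+1}=2 \mid Z_n=1) = 1/4$; but knowing $Z_{n-1}=0$ forces $X_n = 0$, hence $X_{n+1} = 1$, and the probability jumps to $1/2$. The two empirical values disagree, confirming the non-Markov conclusion for this candidate process.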