Markov Chains
Stationary process
A stochastic process $\{X_i\}$ is said to be stationary if the joint distribution of any subset of the sequence of random variables is invariant with respect to shifts in the time index; that is,

\[
\Pr\{X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n\} = \Pr\{X_{1+l} = x_1, X_{2+l} = x_2, \ldots, X_{n+l} = x_n\}
\]

for every $n$, every shift $l$, and for all $x_1, x_2, \ldots, x_n \in \mathcal{X}$.
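As an illustration not in the original text, this shift-invariance can be checked empirically for a process that is known to be stationary. The Python sketch below assumes an i.i.d. Bernoulli(0.3) sequence and example values for the block length $n$, the shift $l$, and the block pattern; it estimates both sides of the defining identity from many independent sample paths, and the two estimates agree up to sampling noise.

import numpy as np

# A minimal sketch, assuming an i.i.d. Bernoulli(0.3) sequence (which is stationary).
# Estimate Pr{X_1 = x_1, ..., X_n = x_n} and Pr{X_{1+l} = x_1, ..., X_{n+l} = x_n}
# from many independent sample paths; for a stationary process the two agree.
rng = np.random.default_rng(0)
num_paths, path_len = 200_000, 10
paths = (rng.random((num_paths, path_len)) < 0.3).astype(int)  # i.i.d. Bernoulli(0.3)

n, l = 3, 4                    # block length and time shift (example values)
pattern = np.array([1, 0, 1])  # the block (x_1, x_2, x_3) whose probability we estimate

p_start = np.mean(np.all(paths[:, 0:n] == pattern, axis=1))
p_shift = np.mean(np.all(paths[:, l:l + n] == pattern, axis=1))
print(p_start, p_shift)        # both estimates are close to 0.3 * 0.7 * 0.3 = 0.063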
Markov process
A discrete stochastic process $X_1, X_2, \ldots$ is said to be a Markov chain or a Markov process if, for $n = 1, 2, \ldots$,

\[
\Pr(X_{n+1} = x_{n+1} \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_1 = x_1) = \Pr(X_{n+1} = x_{n+1} \mid X_n = x_n)
\]

for all $x_1, x_2, \ldots, x_n, x_{n+1} \in \mathcal{X}$.

In this case, the joint probability mass function of the random variables can be written as

\[
p(x_1, x_2, \ldots, x_n) = p(x_1)\, p(x_2 \mid x_1)\, p(x_3 \mid x_2) \cdots p(x_n \mid x_{n-1}).
\]
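As a concrete instance of this factorization (not from the original text), the short Python sketch below evaluates $p(x_1, \ldots, x_n)$ for a hypothetical two-state chain; the initial distribution p_init and the transition probabilities P are illustrative assumptions.

import numpy as np

# A minimal sketch, assuming a two-state chain with an illustrative initial
# distribution and transition probabilities P[i, j] = p(x_{k+1} = j | x_k = i).
p_init = np.array([0.6, 0.4])
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

def joint_pmf(states):
    # p(x_1, ..., x_n) = p(x_1) * p(x_2 | x_1) * ... * p(x_n | x_{n-1})
    prob = p_init[states[0]]
    for prev, nxt in zip(states, states[1:]):
        prob *= P[prev, nxt]
    return prob

print(joint_pmf([0, 0, 1, 1]))  # 0.6 * 0.9 * 0.1 * 0.7 = 0.0378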
Time invariance
The Markov chain is said to be time invariant if the conditional probability $p(x_{n+1} \mid x_n)$ does not depend on $n$; that is, for $n = 1, 2, \ldots$,

\[
\Pr\{X_{n+1} = b \mid X_n = a\} = \Pr\{X_2 = b \mid X_1 = a\}
\]

for all $a, b \in \mathcal{X}$.
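When the chain is time invariant, the one-step conditional probabilities can be collected into a single transition matrix $P$ with entries $P_{ab} = \Pr\{X_{n+1} = b \mid X_n = a\}$, the same matrix for every $n$. The following sketch uses an assumed 2x2 matrix and an example initial distribution (neither specified in the text) and propagates the marginal distribution of $X_n$ by repeated multiplication with this fixed matrix.

import numpy as np

# A minimal sketch of a time-invariant chain: one fixed transition matrix
# P[a, b] = Pr{X_{n+1} = b | X_n = a} is used at every step (values assumed).
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])
mu = np.array([1.0, 0.0])  # distribution of X_1 (example choice)

for step in range(1, 6):
    print(f"distribution of X_{step}: {mu}")
    mu = mu @ P            # marginal of X_{n+1} equals marginal of X_n times P

If the initial distribution is chosen so that mu @ P equals mu, every marginal is the same and the time-invariant chain is also stationary in the sense defined above.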