Assignment 1.11
If $P$ is the transition probability matrix and $M$ is the matrix whose columns are the eigenvectors of $P$ (with $\Lambda$ the diagonal matrix of the corresponding eigenvalues), then show that we can express $P = M \Lambda M^{-1}$, and also prove that $P^{n} = M \Lambda^{n} M^{-1}$.
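The two identities in the assignment can be checked numerically. The sketch below uses an assumed $2 \times 2$ transition matrix (the entries are illustrative, not from the text) and verifies both $P = M \Lambda M^{-1}$ and $P^{n} = M \Lambda^{n} M^{-1}$:

```python
import numpy as np

# An assumed transition probability matrix (each row sums to 1).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Eigen decomposition: columns of M are right eigenvectors of P,
# Lam is the diagonal matrix of eigenvalues.
eigvals, M = np.linalg.eig(P)
Lam = np.diag(eigvals)

# P = M Lam M^{-1}
assert np.allclose(P, M @ Lam @ np.linalg.inv(M))

# P^n = M Lam^n M^{-1}; raising Lam to a power only raises
# each eigenvalue, which is why the decomposition is useful.
n = 5
Pn_direct = np.linalg.matrix_power(P, n)
Pn_eigen = M @ np.diag(eigvals**n) @ np.linalg.inv(M)
assert np.allclose(Pn_direct, Pn_eigen)
print("diagonalization identities hold")
```

The second identity follows from the first by telescoping: $P^{n} = (M \Lambda M^{-1})(M \Lambda M^{-1}) \cdots = M \Lambda^{n} M^{-1}$, since the inner $M^{-1} M$ factors cancel.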
Finite irreducible Markov chain
Stationary distribution: If for a Markov chain with transition probability matrix $P$ the relation $\pi P = \pi$ holds, where $\pi = (\pi_1, \pi_2, \dots)$ is a probability distribution such that in general we have $\pi_j \geq 0$, $\sum_j \pi_j = 1$, then what is of interest to us is to study the conditions under which, as $n \to \infty$, $p_{ij}^{(n)}$ has a limiting value $\pi_j$ such that the initial state from which the stochastic process starts does not affect the value, i.e., the limit is independent of the initial state $i$ from where we start. This would simply mean that, wherever we start, the limiting matrix $\lim_{n \to \infty} P^{n}$ has identical rows. This property is known as ergodicity, and the Markov chain is called ergodic.
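The "identical rows" phenomenon is easy to see numerically. A minimal sketch, using an assumed ergodic two-state chain (entries are illustrative, not from the text):

```python
import numpy as np

# An assumed ergodic transition matrix (rows sum to 1).
P = np.array([[0.5, 0.5],
              [0.2, 0.8]])

# After many steps, every row of P^n is (numerically) the same:
# the distribution no longer depends on the starting state.
Pn = np.linalg.matrix_power(P, 50)
assert np.allclose(Pn[0], Pn[1])

# The common row is the stationary distribution pi.
print(Pn[0])  # approximately (2/7, 5/7) for this example
```

Each row of $P^{n}$ is the distribution after $n$ steps given a different starting state; their convergence to one common row is exactly the ergodicity described above.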
From $\pi^{(n)} = \pi^{(n-1)} P$ and $\pi^{(n)} = \pi^{(0)} P^{n}$, for every state $j$ we find $\pi_j^{(n)} = \sum_i \pi_i^{(n-1)} p_{ij}$; letting $n \to \infty$ this becomes $\pi_j = \sum_i \pi_i p_{ij}$, while for $n = 0$ we have the initial condition $\pi^{(0)}$.
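The fixed-point equation $\pi = \pi P$ says that $\pi$ is a left eigenvector of $P$ with eigenvalue 1, so it can be computed directly from the eigen decomposition of $P^{\mathsf T}$. A sketch under the same assumed example matrix:

```python
import numpy as np

# Assumed example transition matrix (not from the text).
P = np.array([[0.5, 0.5],
              [0.2, 0.8]])

# Left eigenvectors of P are right eigenvectors of P^T.
eigvals, vecs = np.linalg.eig(P.T)

# Pick the eigenvector belonging to eigenvalue 1 and
# normalize it so its entries sum to 1 (a distribution).
k = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(vecs[:, k])
pi = pi / pi.sum()

# pi satisfies the stationarity equation pi = pi P,
# i.e. pi_j = sum_i pi_i p_ij for every state j.
assert np.allclose(pi @ P, pi)
print(pi)  # pi is approximately (2/7, 5/7) here
```

For a finite irreducible chain this eigenvector is unique up to scaling, which is why normalizing by the sum recovers the stationary distribution.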
Theorem 1.8
If state $j$ is persistent, then from every state $k$ that can be reached from $j$, we can reach $j$.