Theorem 1.7
Let $\{X_n\}$ be a finite Markov chain with state space $S = \{0, 1, \ldots, l\}$ that is also a martingale. Then
(i) $\lim_{n \to \infty} p_{ij}^{(n)} = 0$ for $1 \le j \le l-1$
(ii) $\lim_{n \to \infty} p_{il}^{(n)} = \frac{i}{l}$ for $0 \le i \le l$
(iii) $\lim_{n \to \infty} p_{i0}^{(n)} = 1 - \frac{i}{l}$ for $0 \le i \le l$
Proof of Theorem 1.7
Before we go through this simple proof we illustrate the concept of a martingale. A stochastic process $\{X_n\}$ is said to be a martingale if (i) $E|X_n| < \infty$ for all $n$ and (ii) $E[X_{n+1} \mid X_0, X_1, \ldots, X_n] = X_n$. Taking expectations we have $E[X_{n+1}] = E[X_n]$, i.e., $E[X_n] = E[X_0]$ for all $n$.
(i) Let $Y_1, Y_2, \ldots$ be independent random variables with mean $0$, and let $X_n = Y_1 + Y_2 + \cdots + Y_n$; then $\{X_n\}$ is a martingale.
(ii) Let $Y_1, Y_2, \ldots$ be independent random variables with mean $1$, and let $X_n = Y_1 Y_2 \cdots Y_n$; then $\{X_n\}$ is a martingale.
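The two examples above can be checked numerically: since $E[X_n] = E[X_0]$ for a martingale, the sample mean of many simulated paths should stay near $0$ for the sum and near $1$ for the product. The following is a small simulation sketch (not from the source); the choice of $Y_k = \pm 1$ for the sum and $Y_k \in \{0.5, 1.5\}$ for the product are illustrative assumptions.

```python
import random

random.seed(7)

def mean_of_sum(n_steps, n_paths):
    # Example (i): X_n = Y_1 + ... + Y_n with E[Y_k] = 0
    # (here Y_k is a fair +/-1 coin flip, an assumed choice).
    total = 0.0
    for _ in range(n_paths):
        total += sum(random.choice([-1, 1]) for _ in range(n_steps))
    return total / n_paths

def mean_of_product(n_steps, n_paths):
    # Example (ii): X_n = Y_1 * ... * Y_n with E[Y_k] = 1
    # (here Y_k is uniform on {0.5, 1.5}, an assumed choice).
    total = 0.0
    for _ in range(n_paths):
        x = 1.0
        for _ in range(n_steps):
            x *= random.choice([0.5, 1.5])
        total += x
    return total / n_paths

# Both sample means should stay near E[X_0]: about 0 for the sum,
# about 1 for the product (up to Monte Carlo noise).
print(round(mean_of_sum(20, 20000), 2))
print(round(mean_of_product(20, 20000), 2))
```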
Now since $\{X_n\}$ is a martingale as well as a Markov chain with transition probability matrix $P$, we have $E[X_{n+1} \mid X_n = i] = i$, i.e.,
$\sum_{j=0}^{l} j \, p_{ij} = i$ for every state $i \in S$.
Now $\sum_{j=0}^{l} j\, p_{ij} = i$ is satisfied for $i = 0$ iff $p_{00} = 1$ (all terms are nonnegative, so $p_{0j} = 0$ for $j \ge 1$), and for $i = l$ iff $p_{ll} = 1$ (any mass on $j < l$ would pull the mean below $l$). Thus if a finite Markov chain is also a martingale, then its terminal states $0$ and $l$ are absorbing.
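A concrete check of this step: the fair gambler's-ruin chain on $\{0, \ldots, l\}$ (an assumed example, not from the source) satisfies $\sum_j j\,p_{ij} = i$ in every row, and its terminal states are absorbing by construction.

```python
# Fair gambler's-ruin chain on {0, ..., l}: interior states move
# +1/-1 with probability 1/2 each; 0 and l are absorbing.
l = 4
P = [[0.0] * (l + 1) for _ in range(l + 1)]
P[0][0] = 1.0          # terminal state 0 is absorbing
P[l][l] = 1.0          # terminal state l is absorbing
for i in range(1, l):
    P[i][i - 1] = 0.5
    P[i][i + 1] = 0.5

# Martingale identity: sum_j j * p_ij = i for every state i.
for i in range(l + 1):
    mean_next = sum(j * P[i][j] for j in range(l + 1))
    assert abs(mean_next - i) < 1e-12

print(P[0][0], P[l][l])  # both entries are 1.0
```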
Assuming that there are no further closed sets, the interior states $1, 2, \ldots, (l-1)$ are all transient, hence $\lim_{n \to \infty} p_{ij}^{(n)} = 0$ for $1 \le j \le l-1$. Similarly, since the martingale property gives $i = E[X_n \mid X_0 = i] = \sum_{j=0}^{l} j\, p_{ij}^{(n)} \to l \lim_{n \to \infty} p_{il}^{(n)}$, we have $\lim_{n \to \infty} p_{il}^{(n)} = \frac{i}{l}$ for $0 \le i \le l$, and since the limiting row probabilities sum to $1$, $\lim_{n \to \infty} p_{i0}^{(n)} = 1 - \frac{i}{l}$ for $0 \le i \le l$.
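The three limits can be observed numerically by raising $P$ to a high power. This is a sketch (not from the source) using the assumed fair gambler's-ruin chain, whose interior states are transient and whose terminal states are absorbing, so $P^n$ should approach the limits of Theorem 1.7.

```python
# Approximate the n -> infinity limits of p_ij^(n) for the assumed
# fair gambler's-ruin martingale chain on {0, ..., l}.
l = 4
P = [[0.0] * (l + 1) for _ in range(l + 1)]
P[0][0] = 1.0
P[l][l] = 1.0
for i in range(1, l):
    P[i][i - 1] = 0.5
    P[i][i + 1] = 0.5

def mat_mult(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

Pn = [row[:] for row in P]
for _ in range(200):          # Pn becomes P^201, close to the limit
    Pn = mat_mult(Pn, P)

for i in range(l + 1):
    # (i) interior entries vanish; (ii) p_il^(n) -> i/l; (iii) p_i0^(n) -> 1 - i/l
    assert all(Pn[i][j] < 1e-9 for j in range(1, l))
    assert abs(Pn[i][l] - i / l) < 1e-9
    assert abs(Pn[i][0] - (1 - i / l)) < 1e-9
print("limits match i/l and 1 - i/l")
```

Note the limiting row for state $i$ puts mass $1 - i/l$ on $0$ and $i/l$ on $l$, which is exactly the fair-game absorption probability of the gambler's-ruin problem.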