Example 1.27
Consider the following transition probability matrix. We first find the missing values, and then determine the probabilities of transition from one state to another.
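The example's own matrix is not reproduced above, but the idea can be sketched with a hypothetical 3-state matrix: since each row of a transition probability matrix must sum to 1, a single missing entry per row is recovered as 1 minus the known entries, and multi-step transition probabilities follow from matrix powers.

```python
import numpy as np

# Hypothetical 3-state matrix with one missing entry (NaN) per row;
# this is NOT the matrix from Example 1.27, which is not shown here.
P = np.array([
    [0.2, 0.5, np.nan],
    [np.nan, 0.4, 0.3],
    [0.1, np.nan, 0.6],
])

# Each row must sum to 1, so a single missing entry per row
# equals 1 minus the sum of the known entries in that row.
for i in range(P.shape[0]):
    missing = np.isnan(P[i])
    P[i, missing] = 1.0 - np.nansum(P[i])

print(P)

# Two-step transition probabilities are given by P @ P, e.g.
# the probability of going from state 0 to state 2 in two steps:
P2 = P @ P
print(P2[0, 2])
```

Note that this reconstruction only works when each row has a single unknown entry; with two or more unknowns in a row, the row-sum constraint alone is not enough.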
Note
The transition probability matrix along with the initial distribution (initial conditions) completely specifies the Markov chain.
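The note above can be made concrete: given the initial distribution $\pi_0$ and transition matrix $P$, the distribution of the chain after $n$ steps is $\pi_0 P^n$, so the pair $(\pi_0, P)$ determines all finite-dimensional distributions. A minimal sketch with illustrative values (not taken from the text):

```python
import numpy as np

# Hypothetical two-state chain (illustrative values only).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
pi0 = np.array([1.0, 0.0])   # initial distribution: start in state 0

# The pair (pi0, P) fully specifies the chain: the distribution
# of X_n is pi0 @ P^n.  Here, the distribution after 3 steps:
pi_n = pi0 @ np.linalg.matrix_power(P, 3)
print(pi_n)
```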
Properties
1. Strong Markov property: Let $\tau$ be a stopping time for a Markov chain $\{X_n\}$. For any event $A$ determined by the path up to time $\tau$ (i.e., by $X_0, X_1, \ldots, X_\tau$) and any event $B$ determined by the path from time $\tau$ onward, we have
$$P(A \cap B \mid X_\tau = i) = P(A \mid X_\tau = i)\, P(B \mid X_\tau = i).$$
Thus, given $X_\tau = i$, the evolution of the Markov chain starts afresh from state $i$, independently of how it arrived there. Remember that every discrete-time Markov chain has the strong Markov property.
2. Markov chain of order $m$: A stochastic process $\{X_n\}$ is a Markov chain of order $m$ if
$$P(X_{n+1} = x_{n+1} \mid X_n = x_n, \ldots, X_1 = x_1) = P(X_{n+1} = x_{n+1} \mid X_n = x_n, \ldots, X_{n-m+1} = x_{n-m+1}),$$
i.e., the next state depends only on the last $m$ states. In general, stock prices are modeled as a Markov chain of order 1.
3. Markov chain of order 0: If $P(X_{n+1} = x_{n+1} \mid X_n = x_n, \ldots, X_1 = x_1) = P(X_{n+1} = x_{n+1})$ for all $n$, then the process is a Markov chain of order 0, i.e., a sequence of independent random variables.
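The strong Markov property in item 1 can be checked by simulation: if $\tau$ is the first hitting time of a state $i$, then $X_{\tau+1}$ is distributed according to row $i$ of $P$, regardless of the path before $\tau$. A minimal sketch with a hypothetical two-state chain:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state chain (illustrative values only).
P = np.array([[0.5, 0.5],
              [0.2, 0.8]])

def step(state):
    """Draw the next state from row `state` of P."""
    return rng.choice(2, p=P[state])

# tau = first hitting time of state 1, a stopping time.
# By the strong Markov property, the step taken from X_tau = 1
# lands in state 1 with probability P[1, 1] = 0.8, regardless
# of how the chain got there.
trials = 20_000
hits = 0
for _ in range(trials):
    x = 0
    while x != 1:              # run until the chain first reaches state 1
        x = step(x)
    hits += (step(x) == 1)     # one more step from X_tau = 1

print(hits / trials)           # should be close to P[1, 1] = 0.8
```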
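An order-0 chain, as defined in item 3, is just an i.i.d. sequence: the conditional distribution of the next state given the current one equals the marginal distribution. A short sketch, with a hypothetical marginal distribution, verifying this empirically:

```python
import numpy as np

rng = np.random.default_rng(1)

# An order-0 chain is an i.i.d. sequence: the next state does not
# depend on any earlier state.  Hypothetical marginal distribution:
p = np.array([0.5, 0.3, 0.2])
x = rng.choice(3, size=200_000, p=p)

# Empirical P(X_{n+1} = 0 | X_n = k) should be close to
# P(X = 0) = 0.5 for every conditioning state k.
conds = [float(np.mean(x[1:][x[:-1] == k] == 0)) for k in range(3)]
for k, c in enumerate(conds):
    print(k, round(c, 3))
```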