Consider a Markov chain with state space {1, 2, 3} and transition matrix

P =
| 0.4  0.2  0.4 |
| 0.6  0.0  0.4 |
| 0.2  0.5  0.3 |
What is the long-run probability that the chain is in state 1? Solve the problem two different ways: (1) by raising the matrix to a high power, and (2) by directly computing the invariant probability vector as a left eigenvector of P.
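Both approaches can be sketched numerically with NumPy (a sketch, not part of the original question; the exponent 50 is an arbitrary "high enough" power for this small chain):

```python
import numpy as np

# Transition matrix from the problem statement (rows sum to 1)
P = np.array([[0.4, 0.2, 0.4],
              [0.6, 0.0, 0.4],
              [0.2, 0.5, 0.3]])

# Method 1: raise P to a high power. For an irreducible, aperiodic
# chain every row of P^n converges to the invariant vector pi.
pi_power = np.linalg.matrix_power(P, 50)[0]

# Method 2: the invariant vector is the left eigenvector of P for
# eigenvalue 1, i.e. a right eigenvector of P transposed, rescaled
# so its entries sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))
v = np.real(eigvecs[:, idx])
pi_eig = v / v.sum()

print(pi_power)  # both methods agree; entry 0 is the long-run
print(pi_eig)    # probability of being in state 1
```

Solving pi P = pi by hand gives pi = (25/66, 17/66, 24/66), so the long-run probability of state 1 is 25/66 ≈ 0.3788, which both computations reproduce.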