Consider the Markov chain with state space {1, 2, 3} and transition matrix
P =
  [ 0.2  0.4  0.4 ]
  [ 0.1  0.5  0.4 ]
  [ 0.6  0.3  0.1 ]
What is the long-run probability that the chain is in state 1?
Solve this problem two different ways:
1) by raising the matrix to a high power; and
2) by directly computing the invariant (stationary) probability vector as a left eigenvector of P for eigenvalue 1.
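Both approaches can be sketched in a few lines of NumPy (the variable names below are illustrative, not from the problem statement). Method 1 relies on the fact that for an irreducible, aperiodic chain, every row of P^n converges to the stationary distribution; method 2 finds the left eigenvector of P for eigenvalue 1, which is a right eigenvector of P transposed.

```python
import numpy as np

# Transition matrix from the problem statement
P = np.array([[0.2, 0.4, 0.4],
              [0.1, 0.5, 0.4],
              [0.6, 0.3, 0.1]])

# Method 1: raise P to a high power. For this irreducible,
# aperiodic chain, every row of P^n converges to the
# stationary distribution pi.
P_high = np.linalg.matrix_power(P, 50)
print(P_high[0])

# Method 2: pi solves pi P = pi, i.e. pi is a right
# eigenvector of P.T for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
i = np.argmin(np.abs(vals - 1.0))   # pick the eigenvalue closest to 1
pi = np.real(vecs[:, i])
pi = pi / pi.sum()                  # normalize to a probability vector
print(pi)
```

Both methods agree: pi = (11/39, 16/39, 12/39), so the long-run probability of state 1 is 11/39 ≈ 0.282. (You can confirm by hand that this vector satisfies pi P = pi.)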