Given the probability transition matrix of a Markov chain X(n)
with states 1, 2 and 3:
X =
[0.2  0.4  0.4]
[0.3  0.3  0.4]
[0.2  0.6  0.2]
find P(X(10)=2|X(9)=3).
We are asked for the probability P(X(10)=2 | X(9)=3). Since the chain is time-homogeneous, this is just the one-step transition probability from state 3 to state 2, which is the (3,2) entry of the probability transition matrix. Thus:

P(X(10)=2 | X(9)=3) = (3,2) entry of the transition matrix = 0.6 [ANSWER]
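As a quick sanity check, the lookup can be sketched in NumPy (the array name P and the 0-based index shift are just illustrative choices, not part of the problem statement):

```python
import numpy as np

# Transition matrix from the problem: rows = current state, columns = next state
P = np.array([
    [0.2, 0.4, 0.4],
    [0.3, 0.3, 0.4],
    [0.2, 0.6, 0.2],
])

# Each row should sum to 1, as required of a stochastic matrix
assert np.allclose(P.sum(axis=1), 1.0)

# States are labeled 1, 2, 3, so the (3,2) entry is P[2, 1] in 0-based indexing
print(P[2, 1])  # 0.6
```

Because P(X(n+1)=j | X(n)=i) depends only on i and j (not on n), the same entry answers the question for any pair of consecutive times, not just n = 9 and n = 10.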