The transition probability matrix of a Markov chain $\{X_n\}$, $n = 0, 1, 2, \ldots$, having 3 states 1, 2, 3 is

$$P = \begin{pmatrix} 0.1 & 0.5 & 0.4 \\ 0.6 & 0.2 & 0.2 \\ 0.3 & 0.4 & 0.3 \end{pmatrix},$$

and the initial distribution is $P^{(0)} = (0.7, 0.2, 0.1)$.
Find:
i. $P\{X_3 = 2, X_2 = 3, X_1 = 3, X_0 = 2\}$
ii. $P\{X_3 = 3, X_2 = 1, X_1 = 2, X_0 = 1\}$
iii. $P\{X_2 = 3\}$
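By the Markov property, each joint probability in (i) and (ii) factors as $P(X_0 = i_0)\, p_{i_0 i_1}\, p_{i_1 i_2}\, p_{i_2 i_3}$, and (iii) is the third entry of $P^{(0)} P^2$. Here is a minimal numpy sketch of these computations; the helper name `path_prob` and the 0-indexing of the 1-indexed states are my own conventions, not part of the problem.

```python
import numpy as np

# Transition matrix from the problem; row i gives P(X_{n+1} = j | X_n = i).
# States 1, 2, 3 are mapped to 0-indexed rows/columns 0, 1, 2.
P = np.array([[0.1, 0.5, 0.4],
              [0.6, 0.2, 0.2],
              [0.3, 0.4, 0.3]])

p0 = np.array([0.7, 0.2, 0.1])  # initial distribution P^(0)

def path_prob(states):
    """Joint probability of a path (X_0, X_1, ...) via the Markov property."""
    idx = [s - 1 for s in states]      # convert 1-indexed states to array indices
    prob = p0[idx[0]]                  # P(X_0 = first state)
    for a, b in zip(idx, idx[1:]):
        prob *= P[a, b]                # multiply in each one-step transition
    return prob

# i.  P{X3=2, X2=3, X1=3, X0=2}: path 2 -> 3 -> 3 -> 2
print(path_prob([2, 3, 3, 2]))   # 0.2 * 0.2 * 0.3 * 0.4 = 0.0048

# ii. P{X3=3, X2=1, X1=2, X0=1}: path 1 -> 2 -> 1 -> 3
print(path_prob([1, 2, 1, 3]))   # 0.7 * 0.5 * 0.6 * 0.4 = 0.084

# iii. P{X2 = 3} is the third component of P^(0) P^2
print((p0 @ np.linalg.matrix_power(P, 2))[2])  # 0.279
```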