A Markov chain X0, X1, ... on states 0, 1, 2 has the transition probability matrix
P =
| 0.1  0.2  0.7 |
| 0.9  0.1  0.0 |
| 0.1  0.8  0.1 |
and initial distribution p0 = Pr{X0 = 0} = 0.3, p1 = Pr{X0 = 1} = 0.4, and p2 = Pr{X0 = 2} = 0.3. Determine
Pr{X0 = 0, X1 = 1, X2 = 2}.
Please tell me what the initial distribution means, and why p0 = Pr{X0 = 0} = 0.3. Based on my understanding, it should be 0.1. Why is it not 0.1? I don't see 0.3 anywhere inside the matrix.