Question

Suppose that a production process changes states according to a Markov process whose one-step probability transition matrix is given by

        0    1    2    3
  0   0.3  0.5  0.0  0.2
  1   0.5  0.2  0.2  0.1
  2   0.2  0.3  0.4  0.1
  3   0.1  0.2  0.4  0.3

a. What is the probability that the process will be at state 2 after the 105th transition, given that it is at state 0 after the 102nd transition?

b. What is the probability that the process will be at state 2 after the 3rd transition if P(X0 = 0) = 0.2, P(X0 = 2) = 0.3, and P(X0 = 3) = 0.5?

c. Explain why this Markov chain has a steady-state distribution and determine the steady-state probabilities.

d. Suppose that the process is “in-control” if it is in states 0 or 1 and is “out-of-control” if it is in states 2 or 3. In the long run, what fraction of the time is the process in-control?

e. In the long run, what fraction of transitions is from an out-of-control state to an in-control state?

f. What is the long-run time-average "out-of-control" cost if we incur costs of 7 TL and 4 TL each time the process visits states 2 and 3, respectively?

Answer

a)

We need P(X105 = 2 | X102 = 0). Since 105 - 102 = 3 and the chain is time-homogeneous, this is the three-step transition probability from state 0 to state 2, i.e. the (0,2) entry of P^3.

library(expm)   # provides the %^% matrix-power operator

P <- matrix(c(.3,.5,0,.2, .5,.2,.2,.1, .2,.3,.4,.1, .1,.2,.4,.3), nrow=4, byrow=TRUE)

P %^% 3   # three-step transition matrix P^3

      [,1]  [,2]  [,3]  [,4]
[1,] 0.306 0.326 0.198 0.170
[2,] 0.324 0.306 0.206 0.164
[3,] 0.306 0.316 0.220 0.158
[4,] 0.288 0.304 0.250 0.158

Hence P(X105 = 2 | X102 = 0) = 0.198 (row 1, column 3 of the output, i.e. the entry for state 0 to state 2).
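The original answer uses R's expm package; if that isn't at hand, the same three-step matrix can be checked with an equivalent NumPy sketch (not part of the original answer):

```python
import numpy as np

# One-step transition matrix for states 0..3
P = np.array([[0.3, 0.5, 0.0, 0.2],
              [0.5, 0.2, 0.2, 0.1],
              [0.2, 0.3, 0.4, 0.1],
              [0.1, 0.2, 0.4, 0.3]])

P3 = np.linalg.matrix_power(P, 3)   # three-step transition matrix P^3
print(np.round(P3, 3))
print(P3[0, 2])                     # P(X_{n+3} = 2 | X_n = 0) -> 0.198
```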

b)

The initial distribution must cover all four states; since P(X0 = 1) is not listed, it must be 0, so the initial vector is (0.2, 0, 0.3, 0.5):

c(0.2, 0, 0.3, 0.5) %*% (P %^% 3)

      [,1]  [,2]   [,3]   [,4]
[1,] 0.297 0.312 0.2306 0.1604

Hence the probability of being in state 2 after the 3rd transition is P(X3 = 2) = 0.2306.
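The same calculation in NumPy, using the full four-entry initial vector with P(X0 = 1) = 0 (a quick check, not part of the original answer):

```python
import numpy as np

P = np.array([[0.3, 0.5, 0.0, 0.2],
              [0.5, 0.2, 0.2, 0.1],
              [0.2, 0.3, 0.4, 0.1],
              [0.1, 0.2, 0.4, 0.3]])

# Initial distribution over states 0..3; state 1 carries zero mass
mu0 = np.array([0.2, 0.0, 0.3, 0.5])

mu3 = mu0 @ np.linalg.matrix_power(P, 3)  # distribution after 3 transitions
print(np.round(mu3, 4))                   # [0.297  0.312  0.2306 0.1604]
print(mu3[2])                             # P(X_3 = 2) -> 0.2306
```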

c)

The chain is finite and irreducible (every state can reach every other state in a finite number of steps), and it is aperiodic (e.g. P00 = 0.3 > 0), so it is ergodic and therefore has a unique steady-state distribution. Solving mu P = mu together with the normalization sum(mu) = 1 gives

mu = (0.3087071, 0.3139842, 0.2137203, 0.1635884)
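The answer above does not show how these numbers were obtained; one way is to solve the linear system mu(P - I) = 0 with sum(mu) = 1, sketched here in NumPy:

```python
import numpy as np

P = np.array([[0.3, 0.5, 0.0, 0.2],
              [0.5, 0.2, 0.2, 0.1],
              [0.2, 0.3, 0.4, 0.1],
              [0.1, 0.2, 0.4, 0.3]])

n = P.shape[0]
# Stationarity: mu @ P = mu, i.e. (P^T - I) mu^T = 0.
# Stack the normalization row sum(mu) = 1 on top and solve by least squares.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
mu, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(mu, 7))   # approx [0.3087071 0.3139842 0.2137203 0.1635884]
```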

d)

The question asks for the in-control fraction (states 0 and 1):

fraction of time in-control = mu0 + mu1 = 0.3087071 + 0.3139842 = 0.6226913

(The out-of-control fraction is mu2 + mu3 = 0.2137203 + 0.1635884 = 0.3773087.)
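The answer above stops at part (d), but the same stationary distribution also yields parts (e) and (f); a sketch of those computations (not from the original answer):

```python
import numpy as np

P = np.array([[0.3, 0.5, 0.0, 0.2],
              [0.5, 0.2, 0.2, 0.1],
              [0.2, 0.3, 0.4, 0.1],
              [0.1, 0.2, 0.4, 0.3]])
# Steady-state probabilities from part (c)
mu = np.array([0.3087071, 0.3139842, 0.2137203, 0.1635884])

# (d) long-run fraction of time in-control (states 0 and 1)
in_control = mu[0] + mu[1]   # 0.6226913

# (e) long-run fraction of transitions from out-of-control to in-control:
# sum over i in {2,3}, j in {0,1} of mu_i * P[i][j]
oc_to_ic = sum(mu[i] * P[i, j] for i in (2, 3) for j in (0, 1))

# (f) long-run average out-of-control cost per transition:
# 7 TL per visit to state 2, 4 TL per visit to state 3
avg_cost = 7 * mu[2] + 4 * mu[3]

print(in_control, oc_to_ic, avg_cost)
```

This gives roughly 0.156 for part (e) and about 2.15 TL per transition for part (f).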
