Question

A network is modeled by a Markov chain with three states {A, B, C}. States A, B, C denote low,
medium, and high utilization, respectively. From state A, the system may stay at state A with
probability 0.4, or go to state B with probability 0.6, in the next time slot. From state B, it may
go to state C with probability 0.6, or stay at state B with probability 0.4, in the next time slot.
From state C, it may go to state B with probability 0.2, or go to state A with probability 0.3, or
stay at state C with probability 0.5, in the next time slot.
(a) Draw the Markov chain with all the states and transition probabilities.
(b) Find the transition probability matrix, P.
(c) What proportion of the time does the network operate with high utilization?
(d) Assume that this network is leased from a major network provider who charges per use as
follows: for state A (low utilization), state B (medium utilization), and state C (high utilization),
the monthly cost is $1400, $2800, and $5600, respectively. Based on the above Markov model,
how much would it cost to lease the network for one year?
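
A minimal computational sketch of parts (b)-(d), assuming Python with NumPy (the question itself does not ask for code): build P with rows and columns ordered A, B, C, solve pi P = pi together with the entries summing to 1 to get the long-run proportions, and weight the monthly charges by those proportions over twelve months.

    import numpy as np

    # Transition probability matrix from the question, rows/columns ordered A, B, C.
    P = np.array([
        [0.4, 0.6, 0.0],   # from A: stay in A 0.4, go to B 0.6
        [0.0, 0.4, 0.6],   # from B: stay in B 0.4, go to C 0.6
        [0.3, 0.2, 0.5],   # from C: go to A 0.3, go to B 0.2, stay in C 0.5
    ])

    # Stationary distribution: solve pi P = pi together with sum(pi) = 1,
    # written as the overdetermined linear system [P^T - I; 1 1 1] pi = [0 0 0 1]^T.
    A_eq = np.vstack([P.T - np.eye(3), np.ones(3)])
    b_eq = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A_eq, b_eq, rcond=None)

    monthly_cost = np.array([1400.0, 2800.0, 5600.0])  # $ per month in states A, B, C
    print("stationary distribution (A, B, C):", pi)
    print("proportion of time at high utilization (C):", pi[2])
    print("expected cost for one year: $", 12 * (pi @ monthly_cost))

With the transition probabilities as stated, the stationary distribution works out to (3/14, 5/14, 3/7), so the chain spends 3/7 of the time at high utilization and the expected lease cost is about 12 x $3,700 = $44,400 for the year.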

Similar Questions

Suppose that a production process changes states according to a Markov process whose one-step probability transition matrix is given by (rows and columns in state order 0, 1, 2, 3)

        0    1    2    3
    0  0.3  0.5  0.0  0.2
    1  0.5  0.2  0.2  0.1
    2  0.2  0.3  0.4  0.1
    3  0.1  0.2  0.4  0.3

a. What is the probability that the process will be at state 2 after the 105th transition, given that it is at state 0 after the 102nd transition? (A sketch of this kind of multi-step computation appears after this list.) b. What is the probability that the...

Markov chain with transition matrix (states 0 and 1)

        0    1
    0  0.4  0.6
    1  0.7  0.3

a.) Suppose the process begins at state 1 at time t = 1. What is the probability that it will be at state 0 at t = 3? b.) What is the steady-state distribution of the Markov chain above?

The transition probability matrix of a Markov chain {Xn}, n = 1, 2, 3, ..., having 3 states 1, 2, 3 is

    P =  0.1  0.5  0.4
         0.6  0.2  0.2
         0.3  0.4  0.3

and the initial distribution is P(0) = (0.7, 0.2, 0.1). Find: i. P{X3 = 2, X2 = 3, X1 = 3, X0 = 2} ii. P{X3 = 3, X2 = 1, X1 = 2, X0 = 1} iii. P{X2 = 3}

Xn is a Markov chain with state-space E = {0, 1, 2}, transition matrix

    P =  0.4  0.2  ?
         0.6  0.3  ?
         0.5  0.3  ?

and initial probability vector a = [0.2, 0.3, ?]. a) What are the missing values (?) in the transition matrix and initial vector? b) P(X1 = 0) = c) P(X1 = 0 | X0 = 2) = d) P(X22 = 1 | X20 = 2) = e) E[X0] = For the Markov chain with state-space, initial vector, and...

A Markov chain model for a copy machine has three states: working, broken and fixable, broken and unfixable. If it is working, there is a 69.9% chance it will be working tomorrow and a 0.1% chance it will be broken and unfixable tomorrow. If it is broken and fixable today, there is a 49% chance it will be working tomorrow and a 0.2% chance it will be unfixable tomorrow. Unfixable is, of course, unfixable, so the probability that an unfixable machine is unfixable tomorrow...

1. Look for a case example of a Markov chain with a minimum of 3 states. a. Classify the states. b. Determine the period of each state. c. If there is an absorbing state, determine the chance of being absorbed in that state if the chain starts from another state. d. Determine P^10.

The weather in Baton Rouge in the spring can be one of four possible states: Cold (C), Mild (M), Hot (H), or Sizzling (S). Below is a transition probability matrix for a Markov chain model of our weather, with the states listed in the order C, M, H, S:

        C    M    H    S
    C  1/2  1/4  1/4   0
    M  1/5  2/5  1/5  1/5
    H  1/5  1/5  1/5  2/5
    S   0   1/5  2/5  2/5

a. If today's weather is cold, what is the probability that it will be hot a week...

Markov Chains: Three companies provide internet service in a city of 20,000 people. At the beginning of the year, the market shares of each company are as follows: 11,800 people use company A, 6200 people use company B, and only 2000 people use company C. Each month, 5% of company A’s customers switch to company B, and 3% of company A’s customers switch to company C. During the same time, 4% of company B’s customers switch to A, and 6.5%...

Solve the following exercise: A computer is inspected at the end of every hour. The computer may be either working (up) or failed (down). If the computer is found to be up, the probability of it remaining up for the next hour is 0.90. If it is down, the computer is repaired, which may require more than 1 hour. Whenever the computer is down (regardless of how long it has been), the probability of its still being down one hour later is...

An article includes information on the dynamic movement of certain assets between three states: (1) up, (2) middle, and (3) down. For a particular class of assets, the following one-step transition probabilities were estimated from available data:

    P =  .4069  .3536  .2395
         .3995  .5588  .0417
         .5642  .0470  .3888

Suppose that the initial valuation of this asset class found that 31.4% of such assets were in the “up” dynamic state, 40.5% were “middle,” and the remainder were “down.” (a)...
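
Several of the questions above (the production-process chain, the two-state 0/1 chain, and the Baton Rouge weather chain) come down to a k-step transition probability, which is an entry of the matrix power P^k. A minimal sketch of that computation, assuming Python with NumPy and using the production-process matrix as the example; by time homogeneity, going from the 102nd to the 105th transition is k = 3 steps:

    import numpy as np

    # One-step transition matrix from the production-process question, states 0-3.
    P = np.array([
        [0.3, 0.5, 0.0, 0.2],
        [0.5, 0.2, 0.2, 0.1],
        [0.2, 0.3, 0.4, 0.1],
        [0.1, 0.2, 0.4, 0.3],
    ])

    # The k-step transition probabilities are the entries of P^k; here k = 105 - 102 = 3.
    P3 = np.linalg.matrix_power(P, 3)
    print("P(state 2 after the 105th transition | state 0 after the 102nd) =", P3[0, 2])

The same idea answers the weather question (k = 7, assuming one transition per day) on its 4x4 matrix, and the steady-state parts of the other questions are solved exactly as in the sketch after the main question above.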