Question


Consider the Markov chain on S = {1, 2, 3} with transition matrix

p =
     0    1    0
     0   1/2  1/2
    1/2   0   1/2

We will compute the limiting behavior of p^n(1,1) “by hand”.

(A) Find the three eigenvalues λ1, λ2, λ3 of p. Note: some are complex.

(B) Deduce abstractly, using linear algebra and part (A), that we can write

p^n(1,1) = a·λ1^n + b·λ2^n + c·λ3^n

for some constants a, b, c. Don’t find these constants yet.

(C) The first few values of p^n(1,1) are easy to write down: they are p^0(1,1) = 1, p^1(1,1) = 0, and p^2(1,1) = 0. Use these and part (B) to find an explicit formula for p^n(1,1).

(D) What is the limiting behavior of p^n(1,1) as n → ∞?
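A minimal numeric sketch of parts (A)–(D) is below, assuming the transition matrix given above; it uses numpy, the variable names are illustrative, and it is only a sanity check, not the requested by-hand derivation.

```python
# Numeric sanity check of parts (A)-(D); a sketch, not part of the original problem.
import numpy as np

p = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5]])

# (A) Eigenvalues of p: one is 1, the other two are complex.
lams = np.linalg.eigvals(p)
print("eigenvalues:", lams)

# (B)/(C) Solve for a, b, c in p^n(1,1) = a*l1^n + b*l2^n + c*l3^n
# using the known values p^0(1,1) = 1, p^1(1,1) = 0, p^2(1,1) = 0.
l1, l2, l3 = lams
V = np.array([[1, 1, 1],
              [l1, l2, l3],
              [l1**2, l2**2, l3**2]])          # Vandermonde system in the eigenvalues
a, b, c = np.linalg.solve(V, np.array([1.0, 0.0, 0.0], dtype=complex))
print("a, b, c:", a, b, c)

# (D) The formula should agree with direct matrix powers for every n,
# and both should settle down as n grows.
for n in (1, 2, 5, 20):
    from_formula = (a * l1**n + b * l2**n + c * l3**n).real
    from_powers = np.linalg.matrix_power(p, n)[0, 0]   # entry for state 1 -> state 1
    print(n, from_formula, from_powers)
```

If the fitted constants are right, the formula matches the matrix powers exactly, and both converge as n → ∞ because the two complex eigenvalues have modulus 1/2, which is less than 1.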

Homework Answers

Answer #1

We need to find the probability of going from state 3 to state 0 without visiting state 4.

The Markov chain can follow the sequence 3 → 2 → 1 → 0, which has probability 1/2 · 1/2 · 1/2.

It can also return from state 2 to state 3 and then repeat the process.

The same applies at states 2 and 1.

Thus the total probability is

(1/2) · (1/2 + 1/2^2 + 1/2^3 + 1/2^4 + …) · (1/2 + 1/2^2 + 1/2^3 + 1/2^4 + …)

= (1/2) · 1 · 1 = 1/2.

Hence the answer is 1/2.
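As a quick check of the arithmetic in the last step (a sketch only, assuming each parenthesized factor is the geometric series 1/2 + 1/2^2 + 1/2^3 + …, which sums to 1):

```python
# Sketch: verify 1/2 * (1/2 + 1/2^2 + ...) * (1/2 + 1/2^2 + ...) = 1/2.
# Each geometric series sum_{k>=1} (1/2)**k converges to 1.
geometric_sum = sum(0.5**k for k in range(1, 60))
total = 0.5 * geometric_sum * geometric_sum
print(geometric_sum)  # approximately 1.0
print(total)          # approximately 0.5
```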
