Question

Initial state and limit state

In class I showed you the population example of Pomona and Walnut, with an initial population of 100K for each city and a limit (steady) state of 120K for Pomona and 80K for Walnut. The transition probabilities are 0.2 (20%) from Pomona to Walnut and 0.3 (30%) from Walnut to Pomona.

This corresponds to the case p(0) = [0.5, 0.5] and limiting state = [0.6, 0.4] (see Section 12.3, pages 450-451, for the notation).
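As a quick numerical check of this example, here is a minimal Python/NumPy sketch (assuming state 0 = Pomona, state 1 = Walnut, and the transition matrix implied by the probabilities above) that iterates p(n+1) = p(n) P:

# A minimal sketch: iterate p(n+1) = p(n) P for the Pomona/Walnut chain
# and watch the distribution approach the limiting state [0.6, 0.4].
import numpy as np

P = np.array([[0.8, 0.2],   # Pomona -> Pomona, Pomona -> Walnut
              [0.3, 0.7]])  # Walnut -> Pomona, Walnut -> Walnut
p = np.array([0.5, 0.5])    # p(0): both cities start at 100K out of 200K total

for _ in range(50):         # 50 steps is more than enough to converge
    p = p @ P

print(p)                    # approximately [0.6, 0.4], i.e. 120K vs 80K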

Q: Does there exist a Markov chain M that has exactly the opposite behavior, i.e. p(0) = [0.6, 0.4] with limiting state = [0.5, 0.5]? If so, show such an example M (a 2x2 matrix) and prove that its limiting state is [0.5, 0.5]. If not, explain why it does NOT exist.

Homework Answers

Answer #1
                                           Pomona       Walnut
Initial state                              100K         100K
Limit (steady) state                       120K         80K
Transition probability to the other city   0.2 (20%)    0.3 (30%)

So p(0) = [0.5, 0.5] and the limiting state is [0.6, 0.4].

Using the state diagram, the one-step transition probabilities are

P00 = 0.8 (= 1 - p),  P01 = 0.2 (= p),  P10 = 0.3 (= q),  P11 = 0.7 (= 1 - q)

where p = 0.2 is the Pomona-to-Walnut probability and q = 0.3 is the Walnut-to-Pomona probability.

There are two ways to go from Pomona to Walnut in two steps:

P00 P01 = 0.8 * 0.2 = 0.16   (stay in Pomona, then move to Walnut)

P01 P11 = 0.2 * 0.7 = 0.14   (move to Walnut, then stay in Walnut)

so the two-step transition probability from Pomona to Walnut is

P00 P01 + P01 P11 = 0.16 + 0.14 = 0.30
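The same number is the (Pomona, Walnut) entry of the two-step matrix P². A minimal numerical check (Python/NumPy, index 0 = Pomona, 1 = Walnut):

# Sketch: the two-step Pomona -> Walnut probability is the (0, 1) entry of P^2.
import numpy as np

P = np.array([[0.8, 0.2],
              [0.3, 0.7]])
P2 = P @ P
print(P2[0, 1])   # 0.8*0.2 + 0.2*0.7 = 0.30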

A candidate transition matrix for M, built by taking the desired limiting state and the given initial state as its rows, is

P = [ 0.5  0.5 ]
    [ 0.6  0.4 ]

If we square this matrix we get

P² = [ 0.55  0.45 ]
     [ 0.54  0.46 ]

(each row still sums to 1, as it must for a stochastic matrix).

However, this calculation does not settle the question: comparing an entry of P² with the two-step probability 0.30 of the original Pomona/Walnut chain proves nothing, since they are different chains. Solving the stationary equations for this candidate gives a limiting state of [6/11, 5/11] ≈ [0.545, 0.455], so this particular matrix does not work, but a chain with the required behavior does exist. Take any 2x2 doubly stochastic matrix (both rows and columns sum to 1), for example

M = [ 0.7  0.3 ]
    [ 0.3  0.7 ]

Then [0.5, 0.5] M = [0.5*0.7 + 0.5*0.3, 0.5*0.3 + 0.5*0.7] = [0.5, 0.5], so [0.5, 0.5] is the stationary (limiting) state, and because every entry of M is positive, p(n) = p(0) M^n converges to it from any initial state, in particular from p(0) = [0.6, 0.4]. So such a Markov chain M does exist.
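Both statements can be verified numerically; a minimal Python/NumPy sketch (the 50-step power is just an arbitrary cutoff for convergence):

# Sketch: the candidate matrix settles near [6/11, 5/11], not [0.5, 0.5],
# while the doubly stochastic matrix M reaches [0.5, 0.5] from p(0) = [0.6, 0.4].
import numpy as np

candidate = np.array([[0.5, 0.5],
                      [0.6, 0.4]])
M = np.array([[0.7, 0.3],
              [0.3, 0.7]])
p0 = np.array([0.6, 0.4])   # the given initial state

print(p0 @ np.linalg.matrix_power(candidate, 50))  # ~[0.5455, 0.4545]
print(p0 @ np.linalg.matrix_power(M, 50))          # ~[0.5, 0.5]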
