3-part question:
A. The Land of Oz is blessed by many things, but not by good weather. They never have two nice days in a row. If they have a nice day, they are just as likely to have snow as rain the next day. If they have snow or rain, they have an even chance of having the same the next day. If there is a change from snow or rain, only half of the time is this a change to a nice day. Express the daily weather in Oz as a Markov chain.
B. Dorothy is travelling and does not know what the weather was in Oz yesterday. Suppose Dorothy assigns a 1/3 probability to each state. Calculate the probability that tomorrow will be a nice day.
C. Are the Markov chains from parts A and B ergodic?
a)
Take the three states in the order Rain (R), Nice (N), Snow (S). From the description: after a nice day, rain and snow are equally likely (1/2 each) and a nice day never repeats; after rain or snow, the same weather recurs with probability 1/2, and the remaining 1/2 splits equally (1/4 each) between the other two states. This gives the transition matrix

P = [1/2 1/4 1/4 ;
     1/2  0  1/2 ;
     1/4 1/4 1/2]

where rows and columns are ordered R, N, S and entry P(i,j) is the probability of moving from state i to state j.
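A minimal sketch of this chain in Python/NumPy (the state ordering R, N, S and the variable names are my own choice, not part of the original problem), with a check that every row is a probability distribution:

import numpy as np

states = ["Rain", "Nice", "Snow"]            # assumed ordering R, N, S
P = np.array([[0.50, 0.25, 0.25],            # from Rain
              [0.50, 0.00, 0.50],            # from Nice
              [0.25, 0.25, 0.50]])           # from Snow

# sanity check: each row sums to 1
assert np.allclose(P.sum(axis=1), 1.0)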
b)
Take the initial distribution I = [1/3 1/3 1/3] (Dorothy's uniform distribution over today's weather) and the transition matrix P from part a):

P = [0.50 0.25 0.25 ;
     0.50 0.00 0.50 ;
     0.25 0.25 0.50]

I * P = [0.4167 0.1667 0.4167]

The second entry corresponds to the Nice state, so the probability that tomorrow will be a nice day is 1/6 ≈ 0.1667.
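The same one-step computation as a short NumPy sketch (again using my assumed R, N, S ordering):

import numpy as np

P = np.array([[0.50, 0.25, 0.25],   # Rain
              [0.50, 0.00, 0.50],   # Nice
              [0.25, 0.25, 0.50]])  # Snow
I = np.array([1/3, 1/3, 1/3])       # uniform distribution over today's weather

tomorrow = I @ P                    # one step of the chain
print(tomorrow)                     # [0.41666667 0.16666667 0.41666667]
print(tomorrow[1])                  # probability of a nice day tomorrow = 1/6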
c)
A Markov chain is called an ergodic chain if it is possible to go from every state to every state (not necessarily in one move).
Yes. Parts A and B use the same chain, and every state can be reached from every other state: Rain, Nice, and Snow all communicate (every entry of P^2 is strictly positive), so the chain is ergodic (in fact regular).
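A quick NumPy check of that claim (my own sketch, not part of the original answer): squaring P shows every two-step transition probability is positive, so every state reaches every state within two moves.

import numpy as np

P = np.array([[0.50, 0.25, 0.25],
              [0.50, 0.00, 0.50],
              [0.25, 0.25, 0.50]])

P2 = np.linalg.matrix_power(P, 2)   # two-step transition probabilities
print(np.all(P2 > 0))               # True: the chain is regular, hence ergodic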