A Markov chain has the transition matrix P below. If the chain is in state 1 on the 3rd observation, what is the probability that it will be in state 2 on the 6th observation?
[transition matrix P not shown]
Answer:

Going from the 3rd observation to the 6th observation is 6 − 3 = 3 steps, so the required probability is the (1, 2) entry of the three-step transition matrix P³. In other words, compute P³ = P · P · P and read off the entry in row 1 (the starting state) and column 2 (the target state). Since the matrix P itself is not shown here, a numerical value cannot be given, but the method is the same for any transition matrix.
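As a minimal sketch of the computation, the snippet below raises a transition matrix to the third power and reads off the (1, 2) entry. The matrix values used are purely hypothetical placeholders, since the original P is not shown; substitute the actual matrix from the problem.

```python
import numpy as np

# Hypothetical 2-state transition matrix (rows sum to 1); replace with the
# actual matrix P from the problem statement.
P = np.array([
    [0.7, 0.3],
    [0.4, 0.6],
])

# Three steps separate the 3rd and 6th observations, so use P^3.
P3 = np.linalg.matrix_power(P, 3)

# Probability of being in state 2 on the 6th observation given state 1 on
# the 3rd observation: row for state 1, column for state 2 (0-indexed).
prob = P3[0, 1]
print(prob)
```

The same approach works for a chain with any number of states: the (i, j) entry of Pⁿ gives the probability of moving from state i to state j in exactly n steps.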