A Markov chain has the transition matrix shown below:
P = [0.2, 0.5, 0.3]
    [0.4, 0.6, 0.0]
    [1.0, 0.0, 0.0]
1. Find the two-step transition matrix:
   P(2) =
2. Find the three-step transition matrix:
   P(3) =
3. Find the three-step transition probability P_32(3), i.e., the (3, 2) entry of P(3): the probability of moving from state 3 to state 2 in three steps (see the sketch below).
   P_32(3) =
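Since the n-step transition matrix is the n-th power of the one-step matrix, P(n) = P^n, all three parts can be checked numerically. The snippet below is a minimal sketch assuming NumPy is available and that the states are labeled 1, 2, 3 in the order of the rows and columns of P; the variable names P2 and P3 are illustrative, not part of the problem.

```python
import numpy as np

# One-step transition matrix from the problem statement
P = np.array([
    [0.2, 0.5, 0.3],
    [0.4, 0.6, 0.0],
    [1.0, 0.0, 0.0],
])

# n-step transition matrices: P(2) = P^2, P(3) = P^3
P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)

print("P(2) =\n", P2)
print("P(3) =\n", P3)

# P_32(3): probability of moving from state 3 to state 2 in three steps
# (row 3, column 2 of P(3); zero-based indices [2, 1])
print("P_32(3) =", P3[2, 1])
```

As a sanity check, every row of P2 and P3 should still sum to 1, since each power of a stochastic matrix is itself stochastic.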