A Markov chain has the transition matrix shown below:
    P = [0.3  0.2  0.5]
        [0.2  0.4  0.4]
        [0.6  0.0  0.4]
1. Find the two-step transition matrix P(2).
2. Find the three-step transition matrix P(3).



Answer:

The n-step transition matrix is the n-th matrix power of the one-step matrix, so P(2) = P² and P(3) = P³ = P²·P. Each entry is computed by the usual row-by-column product; for example, the (1,1) entry of P² is 0.3·0.3 + 0.2·0.2 + 0.5·0.6 = 0.43.

1. Two-step transition matrix:

    P(2) = P² = [0.43  0.14  0.43]
                [0.38  0.20  0.42]
                [0.42  0.12  0.46]

2. Three-step transition matrix:

    P(3) = P²·P = [0.415  0.142  0.443]
                  [0.406  0.156  0.438]
                  [0.426  0.132  0.442]

As a check, every row of P(2) and P(3) sums to 1, as required for a stochastic matrix.
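The matrix powers above can be verified numerically; a minimal sketch using NumPy (the variable names are my own, not part of the problem):

```python
import numpy as np

# One-step transition matrix from the problem statement.
P = np.array([
    [0.3, 0.2, 0.5],
    [0.2, 0.4, 0.4],
    [0.6, 0.0, 0.4],
])

# The n-step transition matrix is the n-th matrix power of P.
P2 = np.linalg.matrix_power(P, 2)  # two-step: P(2) = P @ P
P3 = np.linalg.matrix_power(P, 3)  # three-step: P(3) = P @ P @ P

print(np.round(P2, 3))
print(np.round(P3, 3))

# Sanity check: each row of a stochastic matrix sums to 1.
assert np.allclose(P2.sum(axis=1), 1.0)
assert np.allclose(P3.sum(axis=1), 1.0)
```

Using `np.linalg.matrix_power` rather than repeated `@` products keeps the intent explicit and scales to any number of steps.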