A Markov chain has the transition matrix shown below:

P =
[0.3  0.2  0.5]
[0.2  0.4  0.4]
[0.6  0.0  0.4]

1. Find the two-step transition matrix P(2).
2. Find the three-step transition matrix P(3).
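The n-step transition matrix of a Markov chain is the n-th matrix power of P, so P(2) = P·P and P(3) = P(2)·P. For example, the (1,1) entry of P(2) is 0.3·0.3 + 0.2·0.2 + 0.5·0.6 = 0.43. Below is a minimal sketch of the computation, assuming NumPy is available; the printed values are from this calculation, rounded to three decimals.

import numpy as np

# One-step transition matrix from the question.
P = np.array([
    [0.3, 0.2, 0.5],
    [0.2, 0.4, 0.4],
    [0.6, 0.0, 0.4],
])

# The n-step transition matrix is the n-th matrix power of P.
P2 = np.linalg.matrix_power(P, 2)   # P(2) = P @ P
P3 = np.linalg.matrix_power(P, 3)   # P(3) = P(2) @ P

print("P(2) =\n", P2)
print("P(3) =\n", P3)

# Expected output (rounded):
# P(2) = [[0.43,  0.14,  0.43 ],
#         [0.38,  0.20,  0.42 ],
#         [0.42,  0.12,  0.46 ]]
# P(3) = [[0.415, 0.142, 0.443],
#         [0.406, 0.156, 0.438],
#         [0.426, 0.132, 0.442]]

As a sanity check, every row of P(2) and P(3) should still sum to 1, since each row is a probability distribution over the next state.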