Consider the Markov chain with the following transition matrix. The initial probability distribution is

    P(X0 = 1) = 0.3,  P(X0 = 2) = 0.2,  P(X0 = 3) = 0.5

        | 0.5  0.2  0.3 |
    P = | 0    0.5  0.5 |
        | 0.3  0.7  0   |

(a) [2 points] Draw the state transition diagram.

(b) [5 points] Which sequence is more probable: X0 = 1, X1 = 1, X3 = 1 or X0 = 2, X2 = 1, X3 = 1?

(c) [5 points] Find P(X1 = 3, X2 = 2, X3 = 1).

(d) [5 points] Find P(X2 = 1).

(e) [5 points] Find the steady-state probability distribution.

(f) [5 points] Starting in state 3, what is the expected number of transitions to visit state 1 for the first time?
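A quick numerical check of parts (b)–(f) can be sketched in plain Python (no external libraries), assuming the matrix P above with the row convention P[i][j] = P(X_{t+1} = j | X_t = i); variable names here are illustrative, not part of the problem:

```python
# Sanity check for parts (b)-(f), assuming the transition matrix above
# with rows indexed by the current state (row-stochastic convention).

P = [[0.5, 0.2, 0.3],
     [0.0, 0.5, 0.5],
     [0.3, 0.7, 0.0]]
pi0 = [0.3, 0.2, 0.5]   # P(X0 = 1), P(X0 = 2), P(X0 = 3)

def step(dist, P):
    """One step of the chain: dist_{t+1} = dist_t * P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def power(P, k):
    """k-step transition matrix P^k, built by repeated row-by-row stepping."""
    n = len(P)
    Q = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(k):
        Q = [step(row, P) for row in Q]
    return Q

P2 = power(P, 2)  # two-step transition probabilities

# (b) Both sequences skip one observation, so the gap uses P^2.
seq1 = pi0[0] * P[0][0] * P2[0][0]   # X0=1, X1=1, X3=1  -> ~0.051
seq2 = pi0[1] * P2[1][0] * P[0][0]   # X0=2, X2=1, X3=1  -> ~0.015

# (c) P(X1=3, X2=2, X3=1) = P(X1=3) * P(3->2) * P(2->1).
pi1 = step(pi0, P)
c = pi1[2] * P[2][1] * P[1][0]       # 0.0, since P(2->1) = 0

# (d) P(X2 = 1): propagate the initial distribution two steps.
pi2 = step(pi1, P)                   # pi2[0] = 0.207

# (e) Steady state via power iteration (chain is irreducible and aperiodic).
pi = pi0
for _ in range(200):
    pi = step(pi, P)                 # converges to ~ [5/27, 41/81, 25/81]

# (f) First-passage time to state 1: with h1 = 0,
#     h2 = 1 + 0.5*h2 + 0.5*h3  and  h3 = 1 + 0.7*h2,
#     substitution gives h2 = 2 + h3, so h3 = 2.4 + 0.7*h3.
h3 = 2.4 / 0.3                       # expected transitions from state 3: 8

print(seq1, seq2, c, pi2[0], pi, h3)
```

Power iteration stands in for solving pi = pi*P directly; 200 steps is far more than enough for a 3-state chain to converge to machine precision.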