A state vector X for a four-state Markov chain is such that the system is four times as likely to be in state 3 as in state 1, is not in state 4, and is in state 2 with probability 0.2. Find the state vector X.
The components of a state vector must add up to 1. If [tex]x_i[/tex] denotes the probability that the system is in state [tex]i[/tex], then the conditions above give
[tex]x_1 + x_2 + x_3 + x_4 = 1,\qquad x_3 = 4x_1,\qquad x_4 = 0,\qquad x_2 = 0.2.[/tex]
Substituting the last three conditions into the first equation yields [tex]x_1 + 0.2 + 4x_1 + 0 = 1[/tex], so [tex]5x_1 = 0.8[/tex] and [tex]x_1 = 0.16[/tex], hence [tex]x_3 = 0.64[/tex]. Therefore [tex]X = (0.16,\ 0.2,\ 0.64,\ 0)[/tex].
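If you want a quick numerical check, here is a small sketch (assuming NumPy is available; the variable names A, b, x are just illustrative) that writes the four conditions as a linear system and solves it:

[code]
import numpy as np

# Conditions on the state vector x = (x1, x2, x3, x4):
#   x1 + x2 + x3 + x4 = 1   (components sum to 1)
#   -4*x1 + x3        = 0   (state 3 is four times as likely as state 1)
#   x4                = 0   (system is not in state 4)
#   x2                = 0.2 (probability of state 2 is 0.2)
A = np.array([
    [ 1.0, 1.0, 1.0, 1.0],
    [-4.0, 0.0, 1.0, 0.0],
    [ 0.0, 0.0, 0.0, 1.0],
    [ 0.0, 1.0, 0.0, 0.0],
])
b = np.array([1.0, 0.0, 0.0, 0.2])

x = np.linalg.solve(A, b)
print(x)  # expected: [0.16 0.2  0.64 0.  ]
[/code]

The printed vector agrees with the hand computation above, and its entries sum to 1 as a state vector must.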