MATH 2210Q Chapter Notes - Chapter 10, 7: Diagonalizable Matrix, Spectral Theorem, Diagonal Matrix
Document Summary
Markov chain - a mathematical model for movement between states. Transition probability - the probability that the chain moves from state j to state i in one step. Transition matrix P - the matrix of transition probabilities; if there are m states, P is an m x m matrix. State vector xn - the vector of probabilities that the chain is in each of the possible states after n steps. Probability vector - a vector with nonnegative entries that sum to 1; every state vector is a probability vector. Initial probability vector - the state vector x0. Stochastic matrix - a matrix whose columns are probability vectors; the transition matrix P is a stochastic matrix. For n = 0, 1, 2, ..., the state vectors of the chain are related by x(n+1) = P xn, or equivalently xn = P^n x0. Since the values in xn depend only on the transition matrix P and on the initial vector x0, a finite-state Markov chain has the property that the next state depends only on the current state.
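The recursion x(n+1) = P xn can be sketched with a small worked example. This is a minimal illustration under assumed data: the 2x2 matrix P below and the starting vector x0 are hypothetical, not from the notes.

```python
def step(P, x):
    """Return the next state vector P x (matrix-vector product)."""
    m = len(x)
    return [sum(P[i][j] * x[j] for j in range(m)) for i in range(m)]

# Hypothetical 2x2 stochastic matrix: each COLUMN is a probability vector
# (sums to 1). Entry P[i][j] is the probability of moving from state j
# to state i in one step.
P = [[0.9, 0.3],
     [0.1, 0.7]]

x0 = [1.0, 0.0]      # initial probability vector: start in state 0
x = x0
for n in range(50):  # repeatedly apply x(n+1) = P xn, so x approaches P^50 x0
    x = step(P, x)

print(x)  # approaches the steady-state vector [0.75, 0.25]
```

Because every state vector is a probability vector and P is stochastic, each iterate still has entries summing to 1; for this particular P the iterates settle toward a steady-state vector v satisfying P v = v.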