18.44 Lecture Notes - Lecture 30: Markov Chains
Document Summary
Consider a sequence of random variables X_0, X_1, X_2, ..., each taking values in the same state space, which for now we take to be a finite set that we label by {0, 1, ..., m}. Interpret X_n as the state of the system at time n. The probability distribution for the next state depends only on the current state (and not on the rest of the state history). Precisely,

P{X_{n+1} = j | X_0 = i_0, X_1 = i_1, ..., X_{n-1} = i_{n-1}, X_n = i} = p_{ij}.

For example, imagine a simple weather model with two states: rainy and sunny. If it's rainy one day, there's a .5 chance it will be rainy the next day and a .5 chance it will be sunny. If it's sunny, the chance of staying sunny is higher, so in this climate, sun tends to last longer than rain. To describe a Markov chain, we need to define p_{ij} for any i, j in {0, 1, ..., m}.
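The two-state weather chain above can be sketched in code. This is a minimal simulation, not from the notes: the rainy row (.5, .5) comes from the example, while the sunny row (here 0.8 chance of staying sunny) is an assumed value chosen only so that sun lasts longer than rain, as the notes describe.

```python
import random

# Transition matrix P[i][j] = probability of moving from state i to state j.
# State 0 = rainy, state 1 = sunny.
P = [
    [0.5, 0.5],  # rainy -> rainy, rainy -> sunny (from the notes' example)
    [0.2, 0.8],  # sunny -> rainy, sunny -> sunny (assumed values)
]

def step(state, rng):
    """Sample the next state given only the current one -- the Markov
    property: the earlier history i_0, ..., i_{n-1} is irrelevant."""
    return 0 if rng.random() < P[state][0] else 1

def simulate(n_steps, state=0, seed=0):
    """Run the chain for n_steps starting from `state`; return the path."""
    rng = random.Random(seed)
    path = [state]
    for _ in range(n_steps):
        state = step(state, rng)
        path.append(state)
    return path

path = simulate(10)  # a list of 11 states, each 0 (rainy) or 1 (sunny)
```

Note that each row of the transition matrix must sum to 1, since from any state the chain moves to exactly one next state.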