CS 3341 Lecture Notes - Lecture 15: Markov Chain, Stochastic Process, Dot Product
Section 6.2 Markov Chains
Page 1 of 11
“Poets do not go mad; but chess-players do. Mathematicians go mad…but creative
artists very seldom.”
—G.K. Chesterton (from Orthodoxy)
Andrei Markov (1856-1922), a student of Chebyshev in St. Petersburg, first proposed
and developed the following idea.
The Markov property
Example: Let X(t) be a discrete-time stochastic process. Consider the following
conditional probabilities: the distribution of the next value given the entire
history of the process,
P{ X(t_{n+1}) ∈ A | X(t_n) = x_n, X(t_{n-1}) = x_{n-1}, ..., X(t_1) = x_1 },
versus the distribution of the next value given only the present state,
P{ X(t_{n+1}) ∈ A | X(t_n) = x_n }.
If X(t) is a stochastic process, then X is Markov if and only if
P{ X(t_{n+1}) ∈ A | X(t_n) = x_n, ..., X(t_1) = x_1 } = P{ X(t_{n+1}) ∈ A | X(t_n) = x_n }
for any t_1 < t_2 < ... < t_n < t_{n+1}, and
for any sets A (consisting of possible values of X(t)).
In words: given the present state of the process, its future is independent of its past.
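The Markov property can be checked numerically by simulation. The sketch below uses a hypothetical two-state chain with assumed transition probabilities (0.7/0.3 from state 0, 0.4/0.6 from state 1; these numbers are not from the lecture). It estimates the probability that the next state is 1 given the present state is 0, both with and without extra conditioning on one earlier step of history; the two estimates should agree.

```python
import random

# Hypothetical two-state chain (states 0 and 1) with assumed transition
# probabilities, used only to illustrate the Markov property numerically.
P = {0: [0.7, 0.3], 1: [0.4, 0.6]}

def step(state):
    """Take one step of the chain from the given state."""
    return 0 if random.random() < P[state][0] else 1

random.seed(1)
n_present = n_present_hit = 0   # condition on X2 = 0 only
n_past = n_past_hit = 0         # condition on X2 = 0 and also X1 = 1
x1 = 0
for _ in range(200_000):
    x2 = step(x1)
    x3 = step(x2)
    if x2 == 0:
        n_present += 1
        n_present_hit += x3
        if x1 == 1:
            n_past += 1
            n_past_hit += x3
    x1 = x3  # keep the chain running

p_given_present = n_present_hit / n_present
p_given_past = n_past_hit / n_past
# Both estimates should be close to p_01 = 0.3: the extra history X1 = 1
# does not change the conditional distribution of X3.
print(round(p_given_present, 2), round(p_given_past, 2))
```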
We will say a stochastic process is a Markov chain if it has the following three
properties:
1. The set of times is discrete: t = 0, 1, 2, ...
2. The set of possible states is discrete (finite or countable).
3. It satisfies the Markov property.
Transition probabilities and time steps
If X(t) is a Markov chain, then we can most effectively analyze X using transition
probabilities
p_ij(t) = P{ X(t+1) = j | X(t) = i },
the probability of moving from state i to state j in one time step. The chain is
homogeneous if these probabilities do not depend on t, in which case we write simply
p_ij. More generally, the h-step transition probability is
p_ij^(h) = P{ X(t+h) = j | X(t) = i }.
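For a homogeneous chain, the one-step probabilities can be collected into a transition probability matrix, where row i is the conditional distribution of the next state given the present state i. A minimal sketch with assumed numbers (not from the lecture):

```python
# Hypothetical transition matrix for a two-state chain; entry P[i][j] is
# p_ij = P{X(t+1) = j | X(t) = i}. The numbers are assumed for illustration.
P = [[0.7, 0.3],
     [0.4, 0.6]]

# Each row is a conditional distribution, so every row must sum to 1.
for i, row in enumerate(P):
    assert abs(sum(row) - 1.0) < 1e-12, f"row {i} does not sum to 1"
print("valid transition matrix")
```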
Transition Diagrams
We can use a directed graph to describe a Markov chain. Each vertex corresponds to a
state, and each arrow may be labeled with the corresponding transition probability.
Example: Draw the transition diagram for a Markov chain with the given transition
probabilities.
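A transition diagram is just a directed graph, so it can be represented in code as a list of labeled arrows, one per nonzero transition probability. The sketch below assumes a two-state chain with hypothetical probabilities (the lecture's actual values are not in these notes):

```python
# Assumed transition probabilities for states "A" and "B"; each nonzero
# entry corresponds to one labeled arrow in the transition diagram.
P = {("A", "A"): 0.7, ("A", "B"): 0.3,
     ("B", "A"): 0.4, ("B", "B"): 0.6}

edges = [f"{i} -> {j} [{p}]" for (i, j), p in P.items() if p > 0]
for e in edges:
    print(e)
```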
Forecasting
We can use the rules of probability to predict at which state the Markov chain will be
any number of steps ahead. That is, we can find the distribution of X(t+h) at a future
time t+h. By the Law of Total Probability,
P{ X(t+h) = j } = Σ_i P{ X(t) = i } p_ij^(h),
where the h-step transition probabilities p_ij^(h) are the entries of the h-th power of
the transition probability matrix.
Example: If the Markov chain in the previous example is currently at a given state,
find the distribution of its state two time steps later.
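A two-step forecast of this kind can be computed by squaring the transition matrix and applying the Law of Total Probability. The sketch below assumes a hypothetical two-state matrix and assumes the chain starts in state 0 (the lecture's actual numbers are not in these notes):

```python
# Assumed transition matrix and starting state; numbers are illustrative only.
P = [[0.7, 0.3],
     [0.4, 0.6]]

def matmul(A, B):
    """Multiply two small matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P2 = matmul(P, P)        # two-step transition probabilities p_ij^(2)
start = [1.0, 0.0]       # currently in state 0 with probability 1
dist2 = matmul([start], P2)[0]
print([round(p, 3) for p in dist2])  # → [0.61, 0.39]
```

Starting from state 0, the two-step distribution is just row 0 of P², since p_0j^(2) = Σ_k p_0k p_kj.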