Make a Markov chain model of a poker game where the states are the number of dollars a player has. With probability 0.3 the player wins 1 dollar in a period, with probability 0.4 the player loses 1 dollar, and with probability 0.3 the player stays the same. The game ends if the player loses all his or her money or if the player reaches 6 dollars (when the game ends, the Markov chain stays in its current state forever). The Markov chain should have seven states, corresponding to the seven different amounts of money: 0, 1, 2, 3, 4, 5, 6 dollars. If you have $2, what is your probability distribution in the next round? In the round after that?
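One way to work out the two requested distributions is to encode the 7×7 transition matrix from the problem statement and multiply the starting distribution by it once and then twice. A minimal sketch using NumPy (the matrix entries come directly from the probabilities given above; states 0 and 6 are made absorbing):

```python
import numpy as np

# Transition matrix over the seven states (0..6 dollars).
# States 0 and 6 are absorbing: once there, the chain stays forever.
P = np.zeros((7, 7))
P[0, 0] = 1.0
P[6, 6] = 1.0
for i in range(1, 6):
    P[i, i - 1] = 0.4  # lose 1 dollar
    P[i, i] = 0.3      # stay the same
    P[i, i + 1] = 0.3  # win 1 dollar

# Starting with $2, the initial distribution puts all mass on state 2.
d0 = np.zeros(7)
d0[2] = 1.0

d1 = d0 @ P  # distribution after one round
d2 = d1 @ P  # distribution after two rounds

print(np.round(d1, 2))  # [0.   0.4  0.3  0.3  0.   0.   0.  ]
print(np.round(d2, 2))  # [0.16 0.24 0.33 0.18 0.09 0.   0.  ]
```

So after one round you hold $1, $2, or $3 with probabilities 0.4, 0.3, 0.3; after two rounds the mass spreads to states 0 through 4 (with probability 0.16 you are already ruined, since 0.4 × 0.4 = 0.16).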