BU283 Lecture Notes - Lecture 4: Expected Value Of Perfect Information

Document Summary

Knowledge of sample or survey information can be used to revise probability estimates for the states of nature (outcomes). Before any information is obtained, the probabilities P(state of nature) are called prior probabilities. Given the conditional probabilities of the sample or survey outcomes (indicators) for each state, the priors can be revised using Bayes' theorem; the results of this analysis are called posterior probabilities.

Revision procedure for a given indicator:
1. For each state of nature, multiply the prior probability by the conditional probability of the indicator. This gives the joint probability of the state and the indicator.
2. Sum the joint probabilities over all states. This gives the marginal probability of the indicator.
3. For each state, divide the joint probability by the marginal probability of the indicator. This gives the posterior probability distribution.

The expected value of sample information (EVSI) is the improvement in the expected value of the decision strategy obtained by using the sample information:
EVSI = expected value with sample information - expected value without sample information.

Efficiency = EVSI / EVPI, or, as a percentage, efficiency = 100 * EVSI / EVPI. A worked sketch of these calculations follows.
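The Python sketch below works one hypothetical example end to end: the two states of nature, the reliability of the "favourable report" indicator, and the dollar expected values are all made-up numbers used only to illustrate the arithmetic of the three revision steps, EVSI, and efficiency; they are not taken from the lecture.

```python
# Hypothetical prior probabilities for two states of nature.
priors = {"strong demand": 0.6, "weak demand": 0.4}

# Hypothetical P(favourable report | state): conditional probabilities
# of the indicator, assumed known from the reliability of the survey.
conditionals = {"strong demand": 0.9, "weak demand": 0.25}

# Step 1: joint probabilities P(state and indicator) = prior * conditional.
joints = {s: priors[s] * conditionals[s] for s in priors}

# Step 2: marginal probability of the indicator = sum of the joints.
marginal = sum(joints.values())

# Step 3: posterior probabilities P(state | indicator) = joint / marginal.
posteriors = {s: joints[s] / marginal for s in joints}

print(f"P(favourable report) = {marginal:.4f}")
for state, p in posteriors.items():
    print(f"P({state} | favourable report) = {p:.4f}")

# EVSI, EVPI, and efficiency, using hypothetical expected values (in $).
ev_with_sample_info = 87_500   # best strategy that uses the survey result
ev_without_info = 80_000       # best decision based on priors alone
ev_with_perfect_info = 95_000  # expected value with perfect information

evsi = ev_with_sample_info - ev_without_info    # value added by the sample
evpi = ev_with_perfect_info - ev_without_info   # value added by perfect info
efficiency = 100 * evsi / evpi                  # percentage efficiency

print(f"EVSI = {evsi}, EVPI = {evpi}, efficiency = {efficiency:.1f}%")
```

With these assumed numbers the posteriors shift sharply toward strong demand after a favourable report (0.6 rises to about 0.84), and the efficiency of the survey is 50%, meaning the sample captures half the value that perfect information would provide.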
