MAT 316 Lecture Notes - Lecture 7: Normal Distribution, Central Limit Theorem, Unimodality

17 Sep 2016
Document Summary

Situation: you are performing Bernoulli trials; the trials are all independent, and P(success) = p remains constant for each trial. Note: this is exactly the same situation as for the geometric distribution. The only difference between the geometric and the negative binomial is the stopping rule: the geometric distribution waits for one success, while the negative binomial waits for the rth success. As was the case with the geometric, there are two different ways to define the variable. Version 1: X = the number of the trial on which the rth success occurs. Deriving the PMF of the NB(r, p) distribution gives p(x) = \binom{x-1}{r-1} p^r (1-p)^{x-r} for x in {r, r+1, ...}. Note: if X ~ Geom(p), then X ~ NB(1, p); that is, the geometric (version 1) distribution is a special case of the negative binomial. Any book that defines the geometric using version 1 will define the negative binomial using version 1. If the book uses the geometric (version 2) distribution, it will also use a different version of the negative binomial:
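The version-1 PMF above and its relationship to the geometric can be checked numerically. Below is a minimal sketch in Python; the function names nb_pmf and geom_pmf are illustrative choices, not notation from the course:

```python
from math import comb

def nb_pmf(x, r, p):
    """PMF of NB(r, p), version 1: X = trial number of the r-th success.
    p(x) = C(x-1, r-1) * p^r * (1-p)^(x-r), valid for x in {r, r+1, ...}."""
    return comb(x - 1, r - 1) * (p ** r) * ((1 - p) ** (x - r))

def geom_pmf(x, p):
    """PMF of Geom(p), version 1: X = trial number of the first success."""
    return p * (1 - p) ** (x - 1)

# The geometric is the r = 1 special case of the negative binomial:
for x in range(1, 20):
    assert abs(nb_pmf(x, 1, 0.3) - geom_pmf(x, 0.3)) < 1e-12

# Sanity check: NB(r, p) probabilities sum to (approximately) 1 over x >= r.
total = sum(nb_pmf(x, 3, 0.4) for x in range(3, 200))
print(round(total, 6))  # close to 1
```

Truncating the infinite sum at x = 199 is fine here because the tail probabilities decay geometrically.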
