More examples of probability mass functions
Random walks
Consider the sample space $\Omega = \{0,1\}^n$ of all sequences of $n$ independent trials. Let the probability of getting a 1 in a single trial be $p$.

For any $\omega \in \Omega$ containing exactly $k$ ones,

$$P(\{\omega\}) = p^k (1-p)^{n-k}.$$

Let $X(\omega)$ denote the number of ones in the sequence $\omega$.

Now, $X(\omega) = k$ iff the number of ones in $\omega$ is $k$. So,

$$p_X(k) = \binom{n}{k} p^k (1-p)^{n-k}, \qquad k = 0, 1, \dots, n.$$

This is the binomial distribution with parameters $n$ and $p$.
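As a quick numerical check of the PMF above, here is a minimal sketch (the function name `binomial_pmf` and the values $n = 10$, $p = 0.3$ are illustrative, not from the text):

```python
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    """P(X = k): probability of exactly k ones in n independent trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3                                   # illustrative values
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]
print(sum(pmf))                                  # ~1.0, as required of a PMF
```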
Hypergeometric distribution
You have a collection of $N$ objects, of which $K$ are of type 1, and $N - K$ of which are of type 2. You randomly pick $n$ objects from the collection (these picks $\omega$ are the elementary events). Let the random variable $X$ map $\omega$ to the number of objects of type 1 that were picked in $\omega$. The probability mass function of $X$ is called a hypergeometric distribution:

$$p_X(k) = \frac{\binom{K}{k} \binom{N-K}{n-k}}{\binom{N}{n}}, \qquad \max(0,\, n - (N - K)) \le k \le \min(n, K).$$
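A short sketch evaluating this PMF directly from binomial coefficients (the values $N = 20$, $K = 7$, $n = 5$ are illustrative):

```python
from math import comb

def hypergeom_pmf(k: int, N: int, K: int, n: int) -> float:
    """P(X = k): k type-1 objects among n drawn from N objects, K of which are type 1."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

N, K, n = 20, 7, 5                               # illustrative values
support = range(max(0, n - (N - K)), min(n, K) + 1)
print(sum(hypergeom_pmf(k, N, K, n) for k in support))   # ~1.0
```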
Geometric distribution
Consider the space $\Omega$ of sequences of independent trials, each trial producing a 1 with probability $p$. Let $X$ be a random variable which denotes the iteration where the first 1 is obtained, with the iterations indexed starting from 0. Then,

$$p_X(k) = (1-p)^k\, p, \qquad k = 0, 1, 2, \dots$$

Note that

$$\sum_{k=0}^{\infty} (1-p)^k\, p = p \cdot \frac{1}{1 - (1 - p)} = 1,$$

so $p_X$ is a legitimate probability mass function.
Additionally, $P(X \ge k) = (1-p)^k$, or, equivalently, $P(X < k) = 1 - (1-p)^k$.
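A sketch (with the illustrative value $p = 0.25$) that checks the truncated PMF sum and the tail probability $P(X \ge k) = (1-p)^k$ by direct summation:

```python
p = 0.25                                         # illustrative success probability

def geometric_pmf(k: int, p: float) -> float:
    """P(X = k): the first 1 appears at iteration k (iterations indexed from 0)."""
    return (1 - p)**k * p

# The PMF sums to 1 (infinite series truncated at a large index).
print(sum(geometric_pmf(k, p) for k in range(1000)))             # ~1.0

# Tail probability P(X >= k) agrees with the closed form (1 - p)**k.
k = 4
print(sum(geometric_pmf(j, p) for j in range(k, 1000)), (1 - p)**k)
```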
Negative binomial distribution
Imagine a sequence of independent Bernoulli trials: each trial has two potential outcomes called “success” and “failure.” In each trial the probability of success is $p$ and of failure is $1 - p$. We observe this sequence until $r$ successes occur. Then the number of observed failures, $X$, follows the negative binomial (or Pascal) distribution with parameters $r$ and $p$:

$$p_X(k) = \binom{k + r - 1}{k} p^r (1-p)^k, \qquad k = 0, 1, 2, \dots$$

This can also be written as

$$p_X(k) = (-1)^k \binom{-r}{k} p^r (1-p)^k.$$

We know from Taylor’s theorem that (for $|x| < 1$)

$$(1-x)^{-r} = \sum_{k=0}^{\infty} \binom{k + r - 1}{k} x^k.$$

If we plug in $x = 1 - p$ and multiply both sides by $p^r$, we get

$$\sum_{k=0}^{\infty} \binom{k + r - 1}{k} p^r (1-p)^k = p^r \cdot p^{-r} = 1.$$

Thus, $\sum_{k=0}^{\infty} p_X(k) = 1$. The other property (nonnegativity) being easily verified, we can conclude that $p_X$ is a legitimate probability mass function.
Observe that the geometric density with parameter $p$ is a negative binomial density with parameters $r = 1$ and $p$.
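Both the normalization and the $r = 1$ special case can be checked numerically; a sketch with the illustrative values $r = 3$, $p = 0.4$:

```python
from math import comb

def negative_binomial_pmf(k: int, r: int, p: float) -> float:
    """P(X = k): k failures are observed before the r-th success."""
    return comb(k + r - 1, k) * p**r * (1 - p)**k

r, p = 3, 0.4                                    # illustrative values
print(sum(negative_binomial_pmf(k, r, p) for k in range(2000)))  # ~1.0

# With r = 1 the PMF reduces to the geometric PMF (1 - p)**k * p.
k = 5
print(negative_binomial_pmf(k, 1, p), (1 - p)**k * p)            # equal
```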
In fact, $p_X$ is a probability mass function even when $r$ is any positive real number, since the series expansion of $(1-x)^{-r}$ above holds for any real $r > 0$ (with $\binom{k+r-1}{k} = \frac{r(r+1)\cdots(r+k-1)}{k!}$).
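For non-integer $r$ the coefficient can be evaluated via log-gamma; a sketch (the value $r = 2.5$ is illustrative) confirming that the mass still sums to 1:

```python
from math import lgamma, exp, log

def nbinom_pmf_real(k: int, r: float, p: float) -> float:
    """Negative binomial PMF for real r > 0; binom(k + r - 1, k) computed via log-gamma."""
    log_coeff = lgamma(k + r) - lgamma(r) - lgamma(k + 1)
    return exp(log_coeff + r * log(p) + k * log(1 - p))

r, p = 2.5, 0.4                                  # illustrative values
print(sum(nbinom_pmf_real(k, r, p) for k in range(2000)))        # ~1.0
```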
Poisson distribution
Let

$$p_X(k) = \frac{e^{-\lambda} \lambda^k}{k!}, \qquad k = 0, 1, 2, \dots,$$

where $\lambda > 0$. Note that

$$\sum_{k=0}^{\infty} \frac{e^{-\lambda} \lambda^k}{k!} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{\lambda^k}{k!} = e^{-\lambda} e^{\lambda} = 1,$$

so $p_X$ is a legitimate probability mass function.
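A sketch checking this numerically (the rate $\lambda = 4.2$ is illustrative):

```python
from math import exp, factorial

def poisson_pmf(k: int, lam: float) -> float:
    """P(X = k) = e**(-lam) * lam**k / k!."""
    return exp(-lam) * lam**k / factorial(k)

lam = 4.2                                        # illustrative rate
print(sum(poisson_pmf(k, lam) for k in range(100)))              # ~1.0
```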
Distribution | Parameters | Support | PGF $E[s^X]$
---|---|---|---
Binomial | $n$ (trials), $p$ (success prob.) | $k \in \{0, 1, \dots, n\}$ | $(1 - p + ps)^n$
Hypergeometric | $N$ (population), $K$ (successes), $n$ (draws) | $\max(0, n + K - N) \le k \le \min(n, K)$ |
Geometric | $p$ (success prob.) | $k \in \{0, 1, 2, \dots\}$ | $\dfrac{p}{1 - (1-p)s}$
Negative Binomial | $r$ (successes), $p$ (success prob.) | $k \in \{0, 1, 2, \dots\}$ | $\left(\dfrac{p}{1 - (1-p)s}\right)^r$
Poisson | $\lambda$ (rate) | $k \in \{0, 1, 2, \dots\}$ | $e^{\lambda(s-1)}$
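The PGF column can be spot-checked by comparing each closed form against the defining sum $E[s^X] = \sum_k p_X(k)\, s^k$; a sketch with illustrative parameter values (the hypergeometric row is skipped, since no closed-form PGF is tabulated for it above):

```python
from math import comb, exp, factorial

s, p = 0.7, 0.3                  # illustrative evaluation point and success probability

# Binomial(n = 10, p): closed form (1 - p + p*s)**n
n = 10
direct = sum(comb(n, k) * p**k * (1 - p)**(n - k) * s**k for k in range(n + 1))
print(direct, (1 - p + p * s)**n)

# Geometric(p), counting the iteration of the first 1: p / (1 - (1 - p)*s)
direct = sum((1 - p)**k * p * s**k for k in range(2000))
print(direct, p / (1 - (1 - p) * s))

# Negative binomial(r = 3, p): (p / (1 - (1 - p)*s))**r
r = 3
direct = sum(comb(k + r - 1, k) * p**r * (1 - p)**k * s**k for k in range(2000))
print(direct, (p / (1 - (1 - p) * s))**r)

# Poisson(lam = 4.2): exp(lam * (s - 1))
lam = 4.2
direct = sum(exp(-lam) * lam**k / factorial(k) * s**k for k in range(100))
print(direct, exp(lam * (s - 1)))
```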