Distributions¶
Geometric Distribution¶
The geometric distribution frequently occurs in applications because we are often interested in how long we have to wait before a certain event happens.
A random variable X for which

$$
P[X = i] = (1-p)^{i-1}\, p, \qquad i = 1, 2, 3, \cdots,
$$

is said to have the geometric distribution with parameter p, abbreviated as X∼Geometric(p).
As a sanity check, we can verify that the total probability of X is equal to 1:

$$
\sum_{i=1}^{\infty} P[X = i] = \sum_{i=1}^{\infty} (1-p)^{i-1}\, p = \frac{p}{1 - (1-p)} = 1,
$$

by summing the geometric series.
If we plot the distribution of X, we get a curve that decreases monotonically by a factor of 1−p at each step.
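To visualize this, here is a minimal Python sketch, assuming numpy and matplotlib are available and taking p = 0.3 as an arbitrary choice, that plots the geometric PMF:

```python
import numpy as np
import matplotlib.pyplot as plt

p = 0.3                       # arbitrary success probability
i = np.arange(1, 21)          # values i = 1, 2, ..., 20
pmf = (1 - p) ** (i - 1) * p  # P[X = i] = (1 - p)^(i - 1) * p

plt.stem(i, pmf)
plt.xlabel("i")
plt.ylabel("P[X = i]")
plt.title("Geometric(0.3) PMF: each bar is (1 - p) times the previous one")
plt.show()
```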
Mean and Variance¶
Let X be a random variable that takes values in {0,1,2,⋯}. Then,

$$
E[X] = \sum_{i=1}^{\infty} P[X \geq i].
$$
Below is the proof of what we call the Tail Sum Formula.
Proof
For notational convenience, let’s write pᵢ = P[X=i], for i=0,1,2,⋯. Writing each term i·pᵢ in E[X] = ∑ i·pᵢ as pᵢ added to itself i times and regrouping column by column,

$$
E[X] = p_1 + 2p_2 + 3p_3 + \cdots = (p_1 + p_2 + p_3 + \cdots) + (p_2 + p_3 + \cdots) + (p_3 + \cdots) + \cdots,
$$

where the j-th group on the right is exactly P[X≥j]. In compact notation:

$$
E[X] = \sum_{i=1}^{\infty} i\, p_i = \sum_{i=1}^{\infty} \sum_{j=1}^{i} p_i = \sum_{j=1}^{\infty} \sum_{i=j}^{\infty} p_i = \sum_{j=1}^{\infty} P[X \geq j],
$$

where the third equality exchanges the order of summation.
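As a small concrete check, here is a short sketch, assuming numpy is available and taking a fair six-sided die roll as an arbitrary example, showing that the two ways of computing the expectation agree:

```python
import numpy as np

# Fair six-sided die: X takes values 1..6, each with probability 1/6.
values = np.arange(1, 7)
probs = np.full(6, 1 / 6)

expectation = np.sum(values * probs)                              # definition: 3.5
tail_sum = sum(np.sum(probs[values >= i]) for i in range(1, 7))   # sum of P[X >= i]

print(expectation, tail_sum)   # both print 3.5
```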
For X∼Geometric(p), we have E[X] = 1/p.
Proof
The key observation is that for a geometric random variable X,

$$
P[X \geq i] = (1-p)^{i-1}, \qquad i = 1, 2, \cdots.
$$

We can obtain this simply by summing P[X=j] for j≥i. Another way of seeing this is to note that the event X≥i means at least i tosses are required, i.e., the first i−1 tosses are all tails, and the probability of this event is (1−p)^(i−1). Applying the Tail Sum Formula,

$$
E[X] = \sum_{i=1}^{\infty} P[X \geq i] = \sum_{i=1}^{\infty} (1-p)^{i-1} = \frac{1}{1 - (1-p)} = \frac{1}{p}.
$$
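As a quick empirical check, here is a minimal sketch, assuming numpy is available and taking p = 0.25 as an arbitrary choice, that compares the sample mean of many Geometric(p) draws to 1/p:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.25        # arbitrary success probability
n = 100_000     # number of samples

# numpy's geometric sampler counts the trials up to and including the
# first success, matching the convention X ∈ {1, 2, ...} used here.
samples = rng.geometric(p, size=n)

print("sample mean:", samples.mean())   # close to 1/p = 4
print("1/p        :", 1 / p)
```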
And for X∼Geometric(p), we have Var(X) = (1−p)/p².
Proof
We will show that E[X(X−1)] = 2(1−p)/p², from which

$$
\mathrm{Var}(X) = E[X^2] - E[X]^2 = E[X(X-1)] + E[X] - E[X]^2 = \frac{2(1-p)}{p^2} + \frac{1}{p} - \frac{1}{p^2} = \frac{1-p}{p^2}.
$$

To show E[X(X−1)] = 2(1−p)/p², one can differentiate the geometric series twice, as sketched below.
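A sketch of this computation, assuming term-by-term differentiation of the geometric series and writing q = 1−p:

$$
\begin{aligned}
\sum_{i=2}^{\infty} i(i-1)\, q^{i-2} &= \frac{d^2}{dq^2} \sum_{i=0}^{\infty} q^i = \frac{d^2}{dq^2}\, \frac{1}{1-q} = \frac{2}{(1-q)^3} = \frac{2}{p^3}, \\
E[X(X-1)] &= \sum_{i=1}^{\infty} i(i-1)(1-p)^{i-1}\, p = p(1-p) \sum_{i=2}^{\infty} i(i-1)(1-p)^{i-2} = p(1-p) \cdot \frac{2}{p^3} = \frac{2(1-p)}{p^2}.
\end{aligned}
$$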
Poisson Distribution¶
A random variable X for which

$$
P[X = i] = \frac{\lambda^i}{i!}\, e^{-\lambda}, \qquad i = 0, 1, 2, \cdots,
$$

is said to have the Poisson distribution with parameter λ, abbreviated as X∼Poisson(λ).
As a sanity check,

$$
\sum_{i=0}^{\infty} P[X = i] = \sum_{i=0}^{\infty} \frac{\lambda^i}{i!}\, e^{-\lambda} = e^{-\lambda} \sum_{i=0}^{\infty} \frac{\lambda^i}{i!} = e^{-\lambda} \cdot e^{\lambda} = 1,
$$

where in the second to last step, we used the Taylor expansion e^x = 1 + x + x²/2! + ⋯.
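A few lines of Python confirm this numerically; this is a minimal sketch, taking λ = 2.5 as an arbitrary choice and truncating the infinite sum:

```python
from math import exp, factorial

lam = 2.5   # arbitrary Poisson parameter
# Truncate the sum at i = 100; the remaining tail is negligible for small lam.
total = sum(lam**i / factorial(i) * exp(-lam) for i in range(101))
print(total)   # prints a value very close to 1.0
```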
The Poisson distribution is a widely accepted model for so-called “rare events”, such as car accidents or ICU admissions at a hospital. It is appropriate whenever the occurrences can be assumed to happen randomly with some constant density in a continuous region (of time or space), such that occurrences in disjoint subregions are independent.
Mean and Variance¶
For a Poisson random variable X with parameter λ, we have E[X] = Var(X) = λ.
Proof

$$
E[X] = \sum_{i=0}^{\infty} i \cdot \frac{\lambda^i}{i!}\, e^{-\lambda} = \lambda e^{-\lambda} \sum_{i=1}^{\infty} \frac{\lambda^{i-1}}{(i-1)!} = \lambda e^{-\lambda} \cdot e^{\lambda} = \lambda.
$$

Similarly,

$$
E[X(X-1)] = \sum_{i=0}^{\infty} i(i-1) \cdot \frac{\lambda^i}{i!}\, e^{-\lambda} = \lambda^2 e^{-\lambda} \sum_{i=2}^{\infty} \frac{\lambda^{i-2}}{(i-2)!} = \lambda^2 e^{-\lambda} \cdot e^{\lambda} = \lambda^2.
$$

Therefore,

$$
\mathrm{Var}(X) = E[X^2] - E[X]^2 = E[X(X-1)] + E[X] - E[X]^2 = \lambda^2 + \lambda - \lambda^2 = \lambda.
$$
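For a quick empirical sanity check, here is a minimal sketch, assuming numpy is available and taking λ = 4 as an arbitrary choice, that estimates the mean and variance by simulation:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 4.0                                 # arbitrary Poisson parameter
samples = rng.poisson(lam, size=200_000)  # many independent Poisson(lam) draws

print("sample mean    :", samples.mean())  # close to 4
print("sample variance:", samples.var())   # also close to 4
```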
Sum of Independent Poisson Random Variables¶
Theorem
Let X∼Poisson(λ) and Y∼Poisson(μ) be independent Poisson random variables. Then, X+Y∼Poisson(λ+μ).
Proof
For all k=0,1,2,⋯, we have

$$
\begin{aligned}
P[X + Y = k] &= \sum_{j=0}^{k} P[X = j,\, Y = k - j]
              = \sum_{j=0}^{k} P[X = j]\, P[Y = k - j] \\
             &= \sum_{j=0}^{k} \frac{\lambda^j}{j!}\, e^{-\lambda} \cdot \frac{\mu^{k-j}}{(k-j)!}\, e^{-\mu}
              = \frac{e^{-(\lambda+\mu)}}{k!} \sum_{j=0}^{k} \binom{k}{j} \lambda^j \mu^{k-j}
              = \frac{(\lambda+\mu)^k}{k!}\, e^{-(\lambda+\mu)},
\end{aligned}
$$
where the second equality follows from independence, and the last equality from the binomial theorem.
By induction, we can conclude that for independent Poisson random variables X₁, X₂, ⋯, Xₙ with parameters λ₁, ⋯, λₙ,

$$
X_1 + X_2 + \cdots + X_n \sim \mathrm{Poisson}(\lambda_1 + \cdots + \lambda_n).
$$
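The following sketch, assuming numpy is available and taking λ = 2 and μ = 3 as arbitrary choices, compares the empirical distribution of X + Y with the Poisson(λ + μ) PMF:

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(0)
lam, mu, n = 2.0, 3.0, 200_000

x = rng.poisson(lam, size=n)
y = rng.poisson(mu, size=n)
s = x + y                       # should be distributed as Poisson(lam + mu)

for k in range(10):
    empirical = np.mean(s == k)
    theoretical = (lam + mu) ** k / factorial(k) * exp(-(lam + mu))
    print(f"k={k}: empirical={empirical:.4f}  Poisson(lam+mu) pmf={theoretical:.4f}")
```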
Poisson as a Limit of Binomial¶
Theorem
Let X∼Binomial(n, λ/n), where λ>0 is a fixed constant. For every i=0,1,2,⋯,

$$
P[X = i] \longrightarrow \frac{\lambda^i}{i!}\, e^{-\lambda} \quad \text{as } n \to \infty;
$$

that is, the distribution of X converges to the Poisson(λ) distribution.
Proof
Fix i∈{0,1,2,⋯}, and assume n≥i (which is no restriction, since n→∞). Then, since X is binomially distributed,

$$
P[X = i] = \binom{n}{i} \left(\frac{\lambda}{n}\right)^{i} \left(1 - \frac{\lambda}{n}\right)^{n-i}
         = \frac{n!}{i!\,(n-i)!} \cdot \frac{\lambda^i}{n^i} \cdot \left(1 - \frac{\lambda}{n}\right)^{n-i}.
$$
Collecting the factors,

$$
P[X = i] = \frac{\lambda^i}{i!} \left( \frac{n!}{(n-i)!\, n^i} \right) \left( 1 - \frac{\lambda}{n} \right)^{n} \left( 1 - \frac{\lambda}{n} \right)^{-i}.
$$
The first parenthesis above, as n→∞, becomes

$$
\frac{n!}{(n-i)!\, n^i} = \frac{n}{n} \cdot \frac{n-1}{n} \cdots \frac{n-i+1}{n} \longrightarrow 1.
$$
The second parenthesis becomes

$$
\left( 1 - \frac{\lambda}{n} \right)^{n} \longrightarrow e^{-\lambda}.
$$
Since i is fixed,

$$
\left( 1 - \frac{\lambda}{n} \right)^{-i} \longrightarrow 1.
$$
Substituting these into the above equation,

$$
P[X = i] \longrightarrow \frac{\lambda^i}{i!} \cdot 1 \cdot e^{-\lambda} \cdot 1 = \frac{\lambda^i}{i!}\, e^{-\lambda},
$$

which is exactly the Poisson(λ) probability, completing the proof.
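To see this convergence numerically, here is a short sketch using only the Python standard library, taking λ = 3 and i = 2 as arbitrary choices, that compares the Binomial(n, λ/n) probability of i to the Poisson(λ) probability as n grows:

```python
from math import comb, exp, factorial

lam, i = 3.0, 2   # arbitrary rate and value whose probability we track
poisson_prob = lam**i / factorial(i) * exp(-lam)

for n in [10, 100, 1_000, 10_000]:
    p = lam / n
    binom_prob = comb(n, i) * p**i * (1 - p) ** (n - i)
    print(f"n={n:>6}: Binomial(n, lam/n) P[X={i}] = {binom_prob:.6f}")

print(f"Poisson(lam)          P[X={i}] = {poisson_prob:.6f}")
```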