Distributions¶
Geometric Distribution¶
The geometric distribution frequently occurs in applications because we are often interested in how long we have to wait before a certain event happens.
A random variable \(X\) for which

\[\mathbb{P}[X=i]=(1-p)^{i-1}p,\qquad i=1,2,3,\cdots\]

is said to have the geometric distribution with parameter \(p\), abbreviated as \(X\sim Geometric(p)\). Here \(X\) can be interpreted as the number of independent coin tosses, each with heads probability \(p\), needed until the first head appears.
As a sanity check, we can verify that the total probability of \(X\) is equal to 1:

\[\sum_{i=1}^{\infty}\mathbb{P}[X=i]=\sum_{i=1}^{\infty}(1-p)^{i-1}p=p\cdot\frac{1}{1-(1-p)}=1,\]

where we used the geometric series formula \(\sum_{k=0}^{\infty}x^k=\frac{1}{1-x}\) for \(|x|<1\).
If we plot the distribution of \(X\), we get a curve that decreases monotonically by a factor of \(1-p\) at each step.
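Both facts can be seen concretely in a minimal numerical sketch using scipy.stats (the choice \(p=0.3\) is arbitrary):

```python
# Numerical sketch of the Geometric(p) PMF (p = 0.3 is an arbitrary choice).
from scipy.stats import geom

p = 0.3
# scipy's geom uses the same convention as these notes: support {1, 2, 3, ...}.
pmf = [geom.pmf(i, p) for i in range(1, 51)]

print(sum(pmf))         # total probability: close to 1
print(pmf[1] / pmf[0])  # consecutive terms shrink by a factor of 1 - p = 0.7
```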
Mean and Variance¶
Let \(X\) be a random variable that takes values in \(\{0,1,2,\cdots\}\). Then,

\[\mathbb{E}[X]=\sum_{i=1}^{\infty}\mathbb{P}[X\geq i].\]
Below is the proof of what we call the Tail Sum Formula.
Proof
For notational convenience, let’s write \(p_i=\mathbb{P}[X=i]\), for \(i=0,1,2, \cdots\). Expanding the expectation and regrouping the terms by tails,

\[\mathbb{E}[X]=0\cdot p_0+1\cdot p_1+2\cdot p_2+3\cdot p_3+\cdots=(p_1+p_2+p_3+\cdots)+(p_2+p_3+\cdots)+(p_3+\cdots)+\cdots.\]

In compact notation:

\[\mathbb{E}[X]=\sum_{i=1}^{\infty}i\,p_i=\sum_{i=1}^{\infty}\sum_{j=1}^{i}p_i=\sum_{j=1}^{\infty}\sum_{i=j}^{\infty}p_i=\sum_{j=1}^{\infty}\mathbb{P}[X\geq j].\]
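As a quick concrete check of the Tail Sum Formula, here is a small self-contained sketch using a fair six-sided die (an arbitrary test case):

```python
# Numerically verify the Tail Sum Formula on a fair six-sided die.
probs = {i: 1 / 6 for i in range(1, 7)}  # P[X = i]

mean = sum(i * p for i, p in probs.items())
tail_sum = sum(sum(p for j, p in probs.items() if j >= i) for i in range(1, 7))
print(mean, tail_sum)  # both equal 3.5
```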
For \(X\sim Geometric(p)\), we have \(\mathbb{E}[X]=\frac{1}{p}\).
Proof
The key observation is that for a geometric random variable \(X\),

\[\mathbb{P}[X\geq i]=(1-p)^{i-1},\qquad i=1,2,\cdots.\]

We can obtain this simply by summing \(\mathbb{P}[X=j]\) for \(j\geq i\). Another way of seeing it is to note that the event \(X\geq i\) means at least \(i\) tosses are required, i.e., the first \(i-1\) tosses are all tails, and the probability of this event is \((1-p)^{i-1}\). Applying the Tail Sum Formula,

\[\mathbb{E}[X]=\sum_{i=1}^{\infty}\mathbb{P}[X\geq i]=\sum_{i=1}^{\infty}(1-p)^{i-1}=\frac{1}{1-(1-p)}=\frac{1}{p}.\]
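A brief numerical sketch (assuming numpy and scipy are available; \(p=0.3\) and the truncation at 200 terms are arbitrary choices) confirms this two ways:

```python
# Two checks of E[X] = 1/p: a simulation and the Tail Sum Formula.
import numpy as np
from scipy.stats import geom

p = 0.3
rng = np.random.default_rng(0)
print(rng.geometric(p, size=100_000).mean())          # simulated mean
print(sum(geom.sf(i - 1, p) for i in range(1, 200)))  # tail sum; both ~ 1/p = 3.33
```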
And for \(X\sim Geometric(p)\), we have \(Var(X)=\frac{1-p}{p^2}\).
Proof
We will show \(\mathbb{E}[X(X-1)]=\frac{2(1-p)}{p^2}\):

\[\mathbb{E}[X(X-1)]=\sum_{i=1}^{\infty}i(i-1)(1-p)^{i-1}p=p(1-p)\sum_{i=2}^{\infty}i(i-1)(1-p)^{i-2}=p(1-p)\cdot\frac{2}{p^3}=\frac{2(1-p)}{p^2},\]

where we used \(\sum_{i=2}^{\infty}i(i-1)x^{i-2}=\frac{d^2}{dx^2}\sum_{i=0}^{\infty}x^i=\frac{2}{(1-x)^3}\) with \(x=1-p\). Therefore,

\[Var(X)=\mathbb{E}[X^2]-\mathbb{E}[X]^2=\mathbb{E}[X(X-1)]+\mathbb{E}[X]-\mathbb{E}[X]^2=\frac{2(1-p)}{p^2}+\frac{1}{p}-\frac{1}{p^2}=\frac{1-p}{p^2}.\]
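Again, a short simulation sketch (with arbitrary parameters) agrees with the formula:

```python
# Simulation check of Var(X) = (1 - p) / p^2 (p and sample size arbitrary).
import numpy as np

rng = np.random.default_rng(1)
p = 0.3
samples = rng.geometric(p, size=100_000)
print(samples.var(), (1 - p) / p**2)  # both close to 7.78
```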
Poisson Distribution¶
A random variable \(X\) for which

\[\mathbb{P}[X=i]=\frac{\lambda^i}{i!}e^{-\lambda},\qquad i=0,1,2,\cdots\]

is said to have the Poisson distribution with parameter \(\lambda\), abbreviated as \(X\sim Poisson(\lambda)\).
As a sanity check,

\[\sum_{i=0}^{\infty}\mathbb{P}[X=i]=\sum_{i=0}^{\infty}\frac{\lambda^i}{i!}e^{-\lambda}=e^{-\lambda}\sum_{i=0}^{\infty}\frac{\lambda^i}{i!}=e^{-\lambda}\cdot e^{\lambda}=1,\]

where in the second-to-last step we used the Taylor expansion \(e^x=1+x+\frac{x^2}{2!}+\cdots\).
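The same check can be done numerically (a minimal sketch; \(\lambda=4\) is an arbitrary choice):

```python
# Sanity check that the Poisson PMF sums to 1 (lambda = 4 is arbitrary;
# the sum is truncated at 100 terms, where the remaining mass is negligible).
from scipy.stats import poisson

lam = 4.0
print(sum(poisson.pmf(i, lam) for i in range(100)))  # close to 1
```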
The Poisson distribution is widely used as a model for so-called “rare events”, such as car accidents or ICU admissions at a hospital. It is appropriate whenever the occurrences can be assumed to happen randomly with some constant density in a continuous region (of time or space), such that occurrences in disjoint subregions are independent.
Mean and Variance¶
For a Poisson random variable \(X\) with parameter \(\lambda\), we have \(\mathbb{E}[X]=Var(X)=\lambda\).
Proof
We compute the mean directly:

\[\mathbb{E}[X]=\sum_{i=0}^{\infty}i\cdot\frac{\lambda^i}{i!}e^{-\lambda}=\lambda e^{-\lambda}\sum_{i=1}^{\infty}\frac{\lambda^{i-1}}{(i-1)!}=\lambda e^{-\lambda}e^{\lambda}=\lambda.\]

Similarly,

\[\mathbb{E}[X(X-1)]=\sum_{i=0}^{\infty}i(i-1)\frac{\lambda^i}{i!}e^{-\lambda}=\lambda^2 e^{-\lambda}\sum_{i=2}^{\infty}\frac{\lambda^{i-2}}{(i-2)!}=\lambda^2.\]

Therefore,

\[Var(X)=\mathbb{E}[X(X-1)]+\mathbb{E}[X]-\mathbb{E}[X]^2=\lambda^2+\lambda-\lambda^2=\lambda.\]
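A quick simulation sketch (with arbitrary \(\lambda\) and sample size) is consistent with both facts:

```python
# Simulation sketch: mean and variance of Poisson(lambda) both equal lambda.
import numpy as np

rng = np.random.default_rng(2)
lam = 4.0
samples = rng.poisson(lam, size=100_000)
print(samples.mean(), samples.var())  # both close to 4.0
```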
Sum of Independent Poisson Random Variables¶
Theorem
Let \(X\sim Poisson(\lambda)\) and \(Y\sim Poisson(\mu)\) be independent Poisson random variables. Then, \(X+Y\sim Poisson(\lambda+\mu)\).
Proof
For all \(k=0,1,2,\cdots\), we have

\[\begin{aligned}\mathbb{P}[X+Y=k]&=\sum_{j=0}^{k}\mathbb{P}[X=j,\,Y=k-j]=\sum_{j=0}^{k}\mathbb{P}[X=j]\,\mathbb{P}[Y=k-j]\\&=\sum_{j=0}^{k}\frac{\lambda^j}{j!}e^{-\lambda}\cdot\frac{\mu^{k-j}}{(k-j)!}e^{-\mu}=\frac{e^{-(\lambda+\mu)}}{k!}\sum_{j=0}^{k}\binom{k}{j}\lambda^j\mu^{k-j}=\frac{(\lambda+\mu)^k}{k!}e^{-(\lambda+\mu)},\end{aligned}\]

where the second equality follows from independence, and the last equality from the binomial theorem.
By induction, we can conclude that for \(X_1,X_2,\cdots,X_n\) independent Poisson random variables with parameters \(\lambda_1,\cdots,\lambda_n\),

\[X_1+X_2+\cdots+X_n\sim Poisson(\lambda_1+\lambda_2+\cdots+\lambda_n).\]
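As an illustrative sketch (\(\lambda=2\), \(\mu=3\), the sample size, and the test point \(k=5\) are all arbitrary choices), we can compare an empirical frequency of the sum against the \(Poisson(\lambda+\mu)\) PMF:

```python
# Simulation sketch of the sum of independent Poissons.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(3)
lam, mu = 2.0, 3.0
s = rng.poisson(lam, size=100_000) + rng.poisson(mu, size=100_000)

# Empirical frequency of {X + Y = 5} vs. the Poisson(lambda + mu) PMF at 5.
print((s == 5).mean(), poisson.pmf(5, lam + mu))
```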
Poisson as a Limit of Binomial¶
Theorem
Let \(X\sim Binomial(n,\frac{\lambda}{n})\) where \(\lambda>0\) is a fixed constant. For every \(i=0,1,2,\cdots\),

\[\mathbb{P}[X=i]\rightarrow\frac{\lambda^i}{i!}e^{-\lambda}\quad\text{as } n\rightarrow\infty.\]
Proof
Fix \(i\in\{0, 1, 2, \cdots \}\), and assume \(n\geq i\) (this is fine because we let \(n\rightarrow \infty\)). Then, because \(X\) is binomially distributed,

\[\mathbb{P}[X=i]=\binom{n}{i}\left(\frac{\lambda}{n}\right)^i\left(1-\frac{\lambda}{n}\right)^{n-i}=\frac{n!}{i!\,(n-i)!}\cdot\frac{\lambda^i}{n^i}\left(1-\frac{\lambda}{n}\right)^{n-i}.\]

Collecting the factors,

\[\mathbb{P}[X=i]=\frac{\lambda^i}{i!}\left(\frac{n!}{(n-i)!\,n^i}\right)\left(\left(1-\frac{\lambda}{n}\right)^{n}\right)\left(1-\frac{\lambda}{n}\right)^{-i}.\]

The first parenthesis above, as \(n\rightarrow\infty\), becomes

\[\frac{n!}{(n-i)!\,n^i}=\frac{n(n-1)\cdots(n-i+1)}{n^i}\rightarrow 1.\]

The second parenthesis becomes

\[\left(1-\frac{\lambda}{n}\right)^{n}\rightarrow e^{-\lambda}.\]

Since \(i\) is fixed,

\[\left(1-\frac{\lambda}{n}\right)^{-i}\rightarrow 1.\]

Substituting these into the above equation,

\[\mathbb{P}[X=i]\rightarrow\frac{\lambda^i}{i!}e^{-\lambda}.\]
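To see the convergence numerically, here is a brief sketch comparing the \(Binomial(n,\frac{\lambda}{n})\) PMF with its Poisson limit (\(\lambda=4\), \(i=3\), and the values of \(n\) are arbitrary choices):

```python
# Numerical illustration of the Poisson limit of the binomial.
from scipy.stats import binom, poisson

lam, i = 4.0, 3
for n in (10, 100, 1_000, 10_000):
    print(n, binom.pmf(i, n, lam / n))
print("limit:", poisson.pmf(i, lam))  # the binomial PMF approaches this value
```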