Section 7.3 Geometric Distribution
Consider the situation where one can observe a sequence of independent trials where the likelihood of a success on each individual trial stays constant from trial to trial. Call this likelihood the probability of "success" and denote its value by \(p\) where \(0 \lt p \lt 1 \text{.}\) If we let the variable \(X\) measure the number of trials needed in order to obtain the first success with \(R = \{1, 2, 3, \ldots \}\text{,}\) then the resulting distribution of probabilities is called a Geometric Distribution.
Since successive trials are independent, the probability that the first success occurs on the \(x\)th trial is the probability that the first \(x-1\) trials were all failures and the \(x\)th was a success. Therefore the desired probability is given by
\begin{equation*}
f(x) = P(X = x) = P(FF...FS) = (1-p)^{x-1}p
\end{equation*}
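For instance, with a fair die the probability of rolling a six on any given roll is \(p = 1/6\text{,}\) so the probability that the first six appears on the third roll is
\begin{equation*}
f(3) = (5/6)^2 (1/6) = 25/216 \approx 0.1157
\end{equation*}
As a quick numerical check, here is a minimal Python sketch (our own illustration, not part of the text; the helper name geom_pmf is hypothetical):
\begin{verbatim}
def geom_pmf(x, p):
    # P(X = x) = (1 - p)^(x - 1) * p for x = 1, 2, 3, ...
    return (1 - p) ** (x - 1) * p

print(geom_pmf(3, 1/6))  # 0.11574..., matching (5/6)^2 * (1/6)
\end{verbatim}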
Theorem 7.3.1 Geometric Distribution sums to 1
\begin{equation*}
f(x) = (1-p)^{x-1}p
\end{equation*}
sums to 1 over \(R = \{ 1, 2, \ldots \}\text{.}\)
Proof
\begin{gather*}
\sum_{x=1}^{\infty} {f(x)} = \sum_{x=1}^{\infty} {(1-p)^{x-1} p} = p \sum_{j=0}^{\infty} {(1-p)^j} = p \frac{1}{1-(1-p)} = 1
\end{gather*}
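The convergence can also be seen numerically; a partial-sum sketch in Python (our own, with \(p = 1/6\) chosen arbitrarily):
\begin{verbatim}
p = 1/6
# Partial sums of f(x) = (1 - p)^(x - 1) * p approach 1;
# the sum through x = 500 differs from 1 by only (1 - p)^500.
total = sum((1 - p) ** (x - 1) * p for x in range(1, 501))
print(total)  # 0.9999999999...
\end{verbatim}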
Theorem 7.3.2 Geometric Mean
For the geometric distribution,
\begin{equation*}
\mu = 1/p
\end{equation*}
Proof
\begin{align*}
\mu & = E[X] = \sum_{k=1}^{\infty} {k(1-p)^{k-1}p}\\
& = p \sum_{k=1}^{\infty} {k(1-p)^{k-1}}\\
& = p \frac{1}{(1-(1-p))^2}\\
& = p \frac{1}{p^2} = \frac{1}{p}
\end{align*}
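The series above is evaluated using the identity obtained by differentiating the geometric series \(\sum_{k=0}^{\infty} q^k = \frac{1}{1-q}\) term by term: for \(|q| \lt 1\text{,}\)
\begin{equation*}
\sum_{k=1}^{\infty} k q^{k-1} = \frac{1}{(1-q)^2}
\end{equation*}
applied here with \(q = 1-p\text{.}\)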
Theorem 7.3.3 Geometric Variance
For the geometric distribution
\begin{equation*}
\sigma^2 = \frac{1-p}{p^2}
\end{equation*}
Proof
\begin{align*}
\sigma^2 & = E[X^2] - \mu^2 = E[X(X-1)] + \mu - \mu^2 \\
& = \sum_{k=1}^{\infty} {k(k-1)(1-p)^{k-1}p} + \mu - \mu^2 \\
& = (1-p)p \sum_{k=2}^{\infty} {k(k-1)(1-p)^{k-2}} + \frac{1}{p} - \frac{1}{p^2}\\
& = (1-p)p \frac{2}{(1-(1-p))^3} + \frac{1}{p} - \frac{1}{p^2}\\
& = \frac{1-p}{p^2}
\end{align*}
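Here the remaining series is evaluated with the second derivative of the geometric series: for \(|q| \lt 1\text{,}\)
\begin{equation*}
\sum_{k=2}^{\infty} k(k-1) q^{k-2} = \frac{2}{(1-q)^3}
\end{equation*}
again applied with \(q = 1-p\text{,}\) so that the first term becomes \((1-p)p \cdot \frac{2}{p^3} = \frac{2(1-p)}{p^2}\text{.}\)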
Theorem 7.3.4 Geometric Distribution Function
For the geometric distribution and any \(x \in R\text{,}\)
\begin{equation*}
F(x) = P(X \le x) = 1- (1-p)^{x}
\end{equation*}
Proof
Consider the accumulated probabilities over the range of values, computed by way of the complement event \(X \gt x\text{:}\)
\begin{align*}
P(X \le x) & = 1 - P(X \gt x)\\
& = 1- \sum_{k={x+1}}^{\infty} {(1-p)^{k-1}p}\\
& = 1- p \frac{(1-p)^{x}}{1-(1-p)}\\
& = 1- (1-p)^{x}
\end{align*}
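For the fair-die example with \(p = 1/6\text{,}\) the probability that the first six occurs within three rolls is \(F(3) = 1 - (5/6)^3 = 91/216 \approx 0.4213\text{.}\)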
Theorem 7.3.5 Statistics for Geometric Distribution
The mean, variance, skewness, and kurtosis of the geometric distribution can be computed by Sage.
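The interactive Sage cell does not reproduce here. As a stand-in, a minimal Python sketch using SciPy's scipy.stats.geom (our own substitution, not the book's Sage code) returns the same four statistics for a chosen value of \(p\text{:}\)
\begin{verbatim}
from scipy.stats import geom

p = 1/6
mean, var, skew, kurt = geom.stats(p, moments='mvsk')
print(mean)  # 6.0    = 1/p
print(var)   # 30.0   = (1-p)/p^2
print(skew)  # 2.008... = (2-p)/sqrt(1-p)
print(kurt)  # 6.033... = 6 + p^2/(1-p)  (excess kurtosis)
\end{verbatim}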
Theorem 7.3.6 The Geometric Distribution yields a memoryless model.
If \(X\) has a geometric distribution and \(a\) and \(b\) are nonnegative integers, then
\begin{equation*}
P( X > a + b | X > b ) = P( X > a)
\end{equation*}
Proof
Using the definition of conditional probability,
\begin{align*}
P( X > a + b | X > b ) & = P( (X > a + b) \cap (X > b) ) / P( X > b)\\
& = P( X > a + b ) / P( X > b)\\
& = (1-p)^{a+b} / (1-p)^b\\
& = (1-p)^a\\
& = P(X > a)
\end{align*}
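The second equality holds because \(\{X \gt a+b\} \subseteq \{X \gt b\}\) when \(a \ge 0\text{,}\) and the third uses \(P(X \gt x) = (1-p)^x\) from Theorem 7.3.4. In other words, given that the first \(b\) trials produced no success, the number of additional trials still needed is again geometric with the same \(p\text{;}\) the process does not "remember" its past failures. A simulation sketch in Python (our own illustration, with \(p\text{,}\) \(a\text{,}\) and \(b\) chosen arbitrarily):
\begin{verbatim}
import random

random.seed(1)
p, a, b = 1/6, 4, 7

def first_success(p):
    # Simulate the trial number of the first success.
    x = 1
    while random.random() >= p:
        x += 1
    return x

draws = [first_success(p) for _ in range(200_000)]
tail = [x for x in draws if x > b]
lhs = sum(x > a + b for x in tail) / len(tail)  # estimates P(X > a+b | X > b)
rhs = sum(x > a for x in draws) / len(draws)    # estimates P(X > a)
print(lhs, rhs)  # both near (1 - p)^a = (5/6)^4 ~ 0.482
\end{verbatim}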