
Section 7.3 Geometric Distribution

Consider a situation in which one observes a sequence of independent trials where the likelihood of a success on each individual trial stays constant from trial to trial. Call this likelihood the probability of "success" and denote its value by \(p\) where \(0 \lt p \lt 1 \text{.}\) If we let the variable \(X\) measure the number of trials needed in order to obtain the first success with \(R = \{1, 2, 3, ... \}\text{,}\) then the resulting distribution of probabilities is called a Geometric Distribution.

Since successive trials are independent, the first success occurring on the \(x\)th trial requires that the previous \(x-1\) trials were all failures. Therefore the desired probability is given by

\begin{equation*} f(x) = P(X = x) = P(FF...FS) = (1-p)^{x-1}p \end{equation*}
To verify that these probabilities total 1 over the range \(R\text{,}\) sum the geometric series:

\begin{gather*} \sum_{x=1}^{\infty} {f(x)} = \sum_{x=1}^{\infty} {(1-p)^{x-1} p} = p \sum_{j=0}^{\infty} {(1-p)^j} = p \frac{1}{1-(1-p)} = 1 \end{gather*}
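The normalization above can be checked numerically. In this Python sketch (with an arbitrarily chosen \(p\)), the infinite sum is truncated at a large cutoff, beyond which the remaining terms are negligible:

```python
# Numerical check that the geometric pmf f(x) = (1-p)^(x-1) * p
# sums to 1 over x = 1, 2, 3, ...  (truncated at a large cutoff).
def geom_pmf(x, p):
    """P(X = x) for a geometric random variable with success probability p."""
    return (1 - p) ** (x - 1) * p

p = 0.3  # arbitrary success probability for illustration
total = sum(geom_pmf(x, p) for x in range(1, 1000))
print(total)  # very close to 1
```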
The mean follows from differentiating the geometric series, using \(\sum_{k=1}^{\infty} k t^{k-1} = \frac{1}{(1-t)^2}\) for \(|t| \lt 1\text{:}\)

\begin{align*} \mu & = E[X] = \sum_{k=1}^{\infty} {k(1-p)^{k-1}p}\\ & = p \sum_{k=1}^{\infty} {k(1-p)^{k-1}}\\ & = p \frac{1}{(1-(1-p))^2}\\ & = p \frac{1}{p^2} = \frac{1}{p} \end{align*}
For the variance, use \(\sigma^2 = E[X(X-1)] + \mu - \mu^2\) together with the second derivative of the geometric series:

\begin{align*} \sigma^2 & = E[X(X-1)] + \mu - \mu^2 \\ & = \sum_{k=1}^{\infty} {k(k-1)(1-p)^{k-1}p} + \mu - \mu^2 \\ & = (1-p)p \sum_{k=2}^{\infty} {k(k-1)(1-p)^{k-2}} + \frac{1}{p} - \frac{1}{p^2}\\ & = (1-p)p \frac{2}{(1-(1-p))^3} + \frac{1}{p} - \frac{1}{p^2}\\ & = \frac{2(1-p)}{p^2} + \frac{1}{p} - \frac{1}{p^2}\\ & = \frac{1-p}{p^2} \end{align*}
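Both closed forms can be confirmed numerically. The sketch below (with an arbitrary \(p\)) compares truncated sums for the mean and variance against \(1/p\) and \((1-p)/p^2\text{:}\)

```python
# Compare truncated-series estimates of E[X] and Var(X)
# against the closed forms 1/p and (1-p)/p^2.
def geom_pmf(x, p):
    return (1 - p) ** (x - 1) * p

p = 0.3  # arbitrary success probability for illustration
xs = range(1, 2000)  # truncation point; the remaining tail is negligible
mean = sum(x * geom_pmf(x, p) for x in xs)
var = sum(x * x * geom_pmf(x, p) for x in xs) - mean ** 2
print(mean, 1 / p)            # both approximately 3.3333
print(var, (1 - p) / p ** 2)  # both approximately 7.7778
```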

Next, consider the accumulated probabilities over a range of values. Summing the tail of the distribution as a geometric series yields a closed form for the cumulative distribution function:

\begin{align*} P(X \le x) & = 1 - P(X \gt x)\\ & = 1- \sum_{k={x+1}}^{\infty} {(1-p)^{k-1}p}\\ & = 1- p \frac{(1-p)^{x}}{1-(1-p)}\\ & = 1- (1-p)^{x} \end{align*}
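The closed form \(P(X \le x) = 1-(1-p)^x\) agrees with a direct partial sum of the probabilities, as this short check illustrates (the values of \(p\) and \(x\) are arbitrary):

```python
# Check the closed-form cdf 1 - (1-p)^x against a direct partial sum of the pmf.
p, x = 0.3, 5  # arbitrary illustration values
partial = sum((1 - p) ** (k - 1) * p for k in range(1, x + 1))
closed = 1 - (1 - p) ** x
print(partial, closed)  # the two agree
```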

Using the definition of conditional probability, one can show that the geometric distribution is memoryless: given that more than \(b\) trials have already passed without a success, the probability of waiting more than \(a\) additional trials is the same as the unconditional probability of waiting more than \(a\) trials.

\begin{align*} P( X > a + b | X > b ) & = P( X > a + b \cap X > b ) / P( X > b)\\ & = P( X > a + b ) / P( X > b)\\ & = (1-p)^{a+b} / (1-p)^b\\ & = (1-p)^a\\ & = P(X > a) \end{align*}
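The memoryless property can also be seen in simulation. This sketch (with arbitrary \(p\text{,}\) \(a\text{,}\) \(b\text{,}\) and sample size) draws geometric variates by counting trials until the first success, then compares the conditional and unconditional tail frequencies:

```python
import random

# Monte Carlo illustration of memorylessness:
# P(X > a + b | X > b) should be close to P(X > a) = (1-p)^a.
random.seed(1)
p, a, b = 0.25, 3, 4  # arbitrary illustration values
n = 200_000

def geom_draw(p):
    """Count independent trials until the first success."""
    trials = 1
    while random.random() >= p:
        trials += 1
    return trials

draws = [geom_draw(p) for _ in range(n)]
beyond_b = [x for x in draws if x > b]
cond = sum(x > a + b for x in beyond_b) / len(beyond_b)
uncond = sum(x > a for x in draws) / n
print(cond, uncond, (1 - p) ** a)  # all three are close
```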