Section 8.3 Exponential Distribution
Once again, consider a Poisson process, but now let X measure the (variable) length of the interval needed to obtain a first success, so that \(R = (0,\infty)\text{.}\) The resulting distribution of X will be called an Exponential distribution.
To derive the probability function for this distribution, find f(x) by first determining the distribution function F(x). This gives
\begin{align*}
F(x)& = P(X \le x)\\
& = 1 - P(X \gt x)\\
& = 1 - P(\text{first change occurs after an interval of length x})\\
& = 1 - P(\text{no changes in the interval [0,x]})\\
& = 1 - \frac{(\lambda x)^0 e^{-\lambda x}}{0!}\\
& = 1 - e^{-\lambda x}
\end{align*}
where the discrete Poisson probability function is used to compute the probability of exactly zero changes in the "fixed" interval [0,x]. Differentiating this distribution function yields
\begin{equation*}
f(x) = F'(x) = \lambda e^{-\lambda x}.
\end{equation*}
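As a quick numerical illustration of this derivation, the sketch below approximates a Poisson process by Bernoulli trials on a fine time grid and compares the empirical distribution of the waiting time until the first change with \(1 - e^{-\lambda x}\text{.}\) The rate \(\lambda = 2\text{,}\) the step size, and the number of trials are arbitrary choices made only for this illustration.

```python
import numpy as np

# Approximate a Poisson process with rate lam by Bernoulli trials on a fine grid:
# each step of length dt contains a change with probability lam * dt.
rng = np.random.default_rng(0)
lam, dt, n_trials = 2.0, 0.001, 20_000

# Time of the first change in each simulated process
# (a geometric number of steps, converted to time by multiplying by dt).
first_change = rng.geometric(lam * dt, size=n_trials) * dt

# Compare the empirical CDF with F(x) = 1 - exp(-lam * x) at a few points.
for x in (0.25, 0.5, 1.0, 2.0):
    empirical = np.mean(first_change <= x)
    exact = 1 - np.exp(-lam * x)
    print(f"x = {x:4.2f}: empirical {empirical:.3f} vs exact {exact:.3f}")
```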
Definition 8.3.1 Exponential Distribution Probability Function
Given a Poisson process and the constant \(\mu = 1/\lambda\text{,}\) suppose X measures the variable interval length needed until you get a first success. Then X has an exponential distribution with probability function
\begin{equation*}
f(x) = \frac{1}{\mu} e^{-\frac{x}{\mu}}.
\end{equation*}
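For instance, with an arbitrarily chosen mean interval length \(\mu = 3\text{,}\) the density at \(x = 2\) can be evaluated directly from this formula, or with scipy.stats.expon, whose scale parameter plays the role of \(\mu\) in this parameterization:

```python
import numpy as np
from scipy.stats import expon

mu = 3.0   # illustrative mean interval length
x = 2.0

# Direct evaluation of f(x) = (1/mu) * exp(-x/mu)
by_hand = (1 / mu) * np.exp(-x / mu)

# scipy's exponential distribution with scale = mu gives the same density
by_scipy = expon.pdf(x, scale=mu)

print(by_hand, by_scipy)   # both approximately 0.1711
```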
Theorem 8.3.2 Verification of Exponential Probability Function
\begin{equation*}
\int_0^{\infty} \frac{1}{\mu} e^{-\frac{x}{\mu}} dx = 1
\end{equation*}
Proof
\begin{align*}
& \int_0^{\infty} \frac{1}{\mu} e^{-\frac{x}{\mu}} dx\\
& = \int_0^{\infty} e^{-u} du \text{, using the substitution } u = \frac{x}{\mu} \text{ and } du = \frac{1}{\mu} dx\\
& = -e^{-u} \big |_0^{\infty} = 1
\end{align*}
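The same integral can also be checked numerically; the sketch below uses scipy.integrate.quad with an arbitrarily chosen value of \(\mu\text{:}\)

```python
import numpy as np
from scipy.integrate import quad

mu = 1.7   # any positive value of mu should give the same result

# Integrate the exponential density over (0, infinity); the total should be 1.
total, err = quad(lambda x: (1 / mu) * np.exp(-x / mu), 0, np.inf)
print(total)   # approximately 1.0
```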
Theorem 8.3.3 Distribution function for Exponential Distribution
\begin{equation*}
F(x) = 1 - e^{-\frac{x}{\mu}}
\end{equation*}
Proof
Using \(f(x) = \frac{1}{\mu} e^{-\frac{x}{\mu}}\text{,}\) note
\begin{align*}
F(x) & = \int_0^x \frac{1}{\mu} e^{-\frac{u}{\mu}} du\\
& = - e^{-\frac{u}{\mu}} \big |_0^x\\
& = 1 - e^{-\frac{x}{\mu}}
\end{align*}
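As a sanity check on this closed form, the sketch below integrates the density numerically from 0 to x and compares the result with \(1 - e^{-x/\mu}\) for a few arbitrarily chosen values:

```python
import numpy as np
from scipy.integrate import quad

mu = 2.5   # illustrative mean interval length

def f(x):
    """Exponential density with mean mu."""
    return (1 / mu) * np.exp(-x / mu)

for x in (0.5, 1.0, 4.0):
    numeric, _ = quad(f, 0, x)
    closed_form = 1 - np.exp(-x / mu)
    print(f"x = {x}: integral {numeric:.5f} vs closed form {closed_form:.5f}")
```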
Theorem 8.3.4 Derivation of Statistics for Exponential Distribution and Plotting
\begin{equation*}
\sigma^2 = \mu^2
\end{equation*}
\begin{equation*}
\gamma_1 = 2
\end{equation*}
\begin{equation*}
\gamma_2 = 9
\end{equation*}
Proof
For the mean, notice that
\begin{align*}
\text{Mean} & = \int_0^{\infty} x \cdot \frac{1}{\mu} e^{-\frac{x}{\mu}} \, dx\\
& = \left[ -(x + \mu) e^{-\frac{x}{\mu}} \right] \big |_0^{\infty} \text{, using integration by parts,}\\
& = 0 - (-\mu) = \mu
\end{align*}
and so the use of \(\mu\) in f(x) is warranted.
The remaining statistics are derived similarly using repeated integration by parts. The interactive Sage cell below calculates them for you automatically.
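That Sage cell is not reproduced here; a minimal plain-Python (SymPy) sketch of the same symbolic computation, starting from the density \(f(x) = \frac{1}{\mu} e^{-x/\mu}\text{,}\) is:

```python
import sympy as sp

x, mu = sp.symbols('x mu', positive=True)
f = sp.exp(-x / mu) / mu                    # exponential density with mean mu

mean = sp.integrate(x * f, (x, 0, sp.oo))   # simplifies to mu

def central_moment(k):
    """k-th central moment E[(X - mean)^k]."""
    return sp.simplify(sp.integrate((x - mean)**k * f, (x, 0, sp.oo)))

var = central_moment(2)                                           # mu**2
gamma1 = sp.simplify(central_moment(3) / var**sp.Rational(3, 2))  # 2
gamma2 = sp.simplify(central_moment(4) / var**2)                  # 9

print(mean, var, gamma1, gamma2)
```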
Theorem 8.3.5 The Exponential Distribution yields a continuous memoryless model.
If X has an exponential distribution and a and b are nonnegative real numbers, then
\begin{equation*}
P( X > a + b | X > b ) = P( X > a)
\end{equation*}
Proof
Using the definition of conditional probability,
\begin{align*}
P( X > a + b | X > b ) & = P( X > a + b \cap X > b ) / P( X > b)\\
& = P( X > a + b ) / P( X > b)\\
& = e^{-(a+b)/ \mu} / e^{-b / \mu}\\
& = e^{-a/ \mu}\\
& = P(X > a)
\end{align*}
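The memoryless property can also be seen in simulation; the sketch below uses arbitrarily chosen values \(\mu = 2\text{,}\) \(a = 1\text{,}\) and \(b = 1.5\text{:}\)

```python
import numpy as np

rng = np.random.default_rng(1)
mu, a, b = 2.0, 1.0, 1.5                 # illustrative mean and thresholds
sample = rng.exponential(scale=mu, size=500_000)

# Estimate P(X > a + b | X > b) from the part of the sample exceeding b
beyond_b = sample[sample > b]
conditional = np.mean(beyond_b > a + b)

# Estimate the unconditional probability P(X > a)
unconditional = np.mean(sample > a)

print(conditional, unconditional, np.exp(-a / mu))   # all approximately equal
```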