Section 8.5 Generating Functions for Poisson Process Distributions
Theorem 8.5.1. Moment Generating Function for Poisson.
Presuming \(t \gt 0\text{,}\)
\begin{equation*}
M(t) = e^{\mu \left ( e^t - 1 \right )}
\end{equation*}
Proof.
Using the Poisson probability function
\begin{align*}
M(t) & = \sum_{x=0}^{\infty} e^{tx} \frac{\mu^x e^{-\mu}}{x!}\\
& = \sum_{x=0}^{\infty} \frac{\left (\mu e^t \right )^x e^{-\mu e^t} e^{\mu e^t} e^{-\mu} }{x!}\\
& = e^{\mu e^t} e^{-\mu} \sum_{x=0}^{\infty} \frac{\left (\mu e^t \right )^x e^{-\mu e^t} }{x!}\\
& = e^{\mu \left( e^t - 1 \right )} \sum_{x=0}^{\infty} \frac{\left (\mu e^t \right )^x e^{-\mu e^t} }{x!}\\
& = e^{\mu \left( e^t - 1 \right )},
\end{align*}
where the remaining sum equals 1 because it sums the Poisson probability function with the new mean \(\mu e^t\) over its full support.
Corollary 8.5.2. Poisson Properties via Moment Generating Function.
For the Poisson variable X,
\begin{equation*}
M(0) = 1
\end{equation*}
\begin{equation*}
M'(0) = \mu
\end{equation*}
\begin{equation*}
M''(0) = \mu + \mu^2 = \sigma^2 + \mu^2
\end{equation*}
Proof.
\begin{equation*}
M(0) = e^{\mu \left ( e^0 - 1 \right )} = e^0 = 1.
\end{equation*}
Continuing,
\begin{equation*}
M'(t) = \mu e^{\left(\mu {\left(e^{t} - 1\right)} + t\right)}
\end{equation*}
and therefore
\begin{equation*}
M'(0) = \mu e^{\left(\mu {\left(1 - 1\right)} + 0\right)} = \mu e^0 = \mu.
\end{equation*}
Continuing with the second derivative,
\begin{equation*}
M''(t) = {\left(\mu e^{t} + 1\right)} \mu e^{\left(\mu {\left(e^{t} - 1\right)} + t\right)}
\end{equation*}
and therefore
\begin{equation*}
M''(0) = {\left(\mu + 1\right)} \mu e^{\left(\mu {\left(1 - 1\right)} + 0\right)} = {\left(\mu + 1\right)} \mu e^0 = \mu + \mu^2,
\end{equation*}
which is the variance plus the squared mean for the Poisson distribution.
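The corollary's three values follow by differentiating \(M(t)\) directly; a short sympy sketch of the same computation:

```python
import sympy as sp

# mu > 0 is the Poisson mean; M is the MGF from Theorem 8.5.1.
mu = sp.symbols('mu', positive=True)
t = sp.symbols('t')
M = sp.exp(mu*(sp.exp(t) - 1))

assert M.subs(t, 0) == 1                                  # M(0) = 1
assert sp.simplify(M.diff(t).subs(t, 0) - mu) == 0        # M'(0) = mu
assert sp.expand(M.diff(t, 2).subs(t, 0)) == mu + mu**2   # M''(0) = mu + mu^2
```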
Theorem 8.5.3. Moment Generating Function for Exponential.
Presuming \(t \lt \frac{1}{\mu}\text{,}\)
\begin{equation*}
M(t) = \frac{1}{1-\mu t}.
\end{equation*}
Proof.
\begin{align*}
M(t) & = \int_0^{\infty} e^{tx} \frac{1}{\mu} e^{-\frac{x}{\mu}} dx\\
& = \frac{1}{\mu} \int_0^{\infty} e^{- x \left ( -t + \frac{1}{\mu} \right ) } dx\\
& = \frac{1}{\mu} \cdot \frac{-1}{\left(-t + \frac{1}{\mu} \right )} e^{- x \left ( -t + \frac{1}{\mu} \right ) } \big |_0^{\infty}\\
& = \frac{-1}{\mu \left(-t + \frac{1}{\mu} \right )} \left ( 0 - 1 \right )\\
& = \frac{1}{1-\mu t}.
\end{align*}
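Since the proof hinges on an exponential integral with positive rate \(\frac{1}{\mu} - t\text{,}\) the same steps can be sketched in sympy by writing \(s = \frac{1}{\mu} - t \gt 0\text{:}\)

```python
import sympy as sp

# mu > 0 is the mean; s = 1/mu - t, which is positive when t < 1/mu.
mu, s, x = sp.symbols('mu s x', positive=True)

# The integral from the proof: (1/mu) * Int_0^oo e^{-s x} dx = 1/(mu*s).
M_in_s = sp.integrate(sp.exp(-s*x)/mu, (x, 0, sp.oo))

# Substitute s = 1/mu - t back and simplify to 1/(1 - mu*t).
t = sp.symbols('t')
M = M_in_s.subs(s, 1/mu - t)
assert sp.simplify(M - 1/(1 - mu*t)) == 0
```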
Corollary 8.5.4. Exponential Properties via Moment Generating Function.
For the Exponential variable X,
\begin{equation*}
M(0) = 1
\end{equation*}
\begin{equation*}
M'(0) = \mu
\end{equation*}
\begin{equation*}
M''(0) = \mu^2 + \mu^2 = \sigma^2 + \mu^2
\end{equation*}
Proof.
\begin{equation*}
M(0) = \frac{1}{1-\mu \cdot 0} = 1.
\end{equation*}
Continuing,
\begin{equation*}
M'(t) = \frac{\mu}{ \left ( 1-\mu t \right )^2}
\end{equation*}
and therefore
\begin{equation*}
M'(0) = \frac{\mu}{ \left ( 1-\mu \cdot 0 \right )^2} = \mu.
\end{equation*}
Continuing with the second derivative,
\begin{equation*}
M''(t) = \frac{2 \mu^2}{ \left ( 1-\mu t \right )^3}
\end{equation*}
and therefore
\begin{equation*}
M''(0) = \frac{2 \mu^2}{ \left ( 1-\mu \cdot 0 \right )^3}= 2 \mu^2 = \mu^2 + \mu^2,
\end{equation*}
which is the variance plus the squared mean for the exponential distribution.
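Differentiating \(\frac{1}{1-\mu t}\) can likewise be checked with a few lines of sympy:

```python
import sympy as sp

# mu > 0 is the exponential mean; M is the MGF from Theorem 8.5.3.
mu = sp.symbols('mu', positive=True)
t = sp.symbols('t')
M = 1/(1 - mu*t)

assert M.subs(t, 0) == 1                                       # M(0) = 1
assert sp.simplify(M.diff(t).subs(t, 0) - mu) == 0             # M'(0) = mu
assert sp.simplify(M.diff(t, 2).subs(t, 0) - 2*mu**2) == 0     # M''(0) = 2 mu^2
```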
Theorem 8.5.5. Moment Generating Function for Gamma.
Presuming \(t \lt \frac{1}{\mu}\text{,}\) where \(\mu\) is the mean waiting time until the first "change" and \(r\) is the number of changes desired,
\begin{equation*}
M(t) = \frac{1}{ \left ( 1-\mu t \right )^{r}}
\end{equation*}
Proof.
\begin{align*}
M(t) & = \int_0^{\infty} e^{tx} \frac{x^{r-1} \cdot e^{-\frac{x}{\mu}}}{\Gamma(r) \cdot \mu^r} dx\\
& = \int_0^{\infty} \frac{x^{r-1} \cdot e^{-x \left ( \frac{1}{\mu} - t \right )}}{\Gamma(r) \cdot \mu^r} dx\\
& = { \frac{1}{\left ( 1-t \mu \right )^r} } \int_0^{\infty} \frac{x^{r-1} \cdot e^{-\frac{x}{ \left ( \frac{\mu}{1-t \mu} \right )}}}{\Gamma(r) \cdot { \left ( \frac{\mu}{1-t \mu} \right )}^r} dx\\
& = \frac{1}{\left(1-\mu t \right )^{r}},
\end{align*}
since the last integral is of the Gamma probability function with the mean waiting time adjusted to \(\frac{\mu}{1-t\mu}\text{,}\) and a probability function integrates to 1.
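The collapsing step relies on the Gamma probability function integrating to 1 for any positive shape and scale; a sympy sketch of that normalization check (with \(b\) playing the role of the adjusted scale \(\frac{\mu}{1-t\mu}\)):

```python
import sympy as sp

# r > 0 is the number of changes (shape); b > 0 stands for the
# adjusted scale mu/(1 - t*mu) from the proof.
r, b, x = sp.symbols('r b x', positive=True)

# The Gamma probability function with shape r and scale b integrates to 1,
# which is what collapses the last integral in the proof.
total = sp.integrate(x**(r-1)*sp.exp(-x/b)/(sp.gamma(r)*b**r), (x, 0, sp.oo))
assert sp.simplify(total) == 1
```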
Corollary 8.5.6. Gamma Properties via Moment Generating Function.
For the Gamma variable X,
\begin{equation*}
M(0) = 1
\end{equation*}
\begin{equation*}
M'(0) = r \mu
\end{equation*}
\begin{equation*}
M''(0) = r \mu^2 + \left ( r \mu \right )^2 = \sigma^2 + \mu^2
\end{equation*}
Proof.
\begin{equation*}
M(0) = \frac{1}{ \left ( 1-\mu \cdot 0 \right )^{r}} = \frac{1}{1} = 1.
\end{equation*}
Continuing,
\begin{equation*}
M'(t) = \frac{r \mu}{ \left ( 1-\mu t \right )^{r+1}}
\end{equation*}
and therefore
\begin{equation*}
M'(0) = \frac{r \mu}{ \left ( 1-\mu \cdot 0 \right )^{r+1}} = r \mu.
\end{equation*}
Continuing with the second derivative,
\begin{equation*}
M''(t) = \frac{r(r+1) \mu^2}{ \left ( 1-\mu t \right )^{r+2}}
\end{equation*}
and therefore
\begin{equation*}
M''(0) = \frac{r(r+1) \mu^2}{ \left ( 1-\mu \cdot 0 \right )^{r+2}} = r(r+1) \mu^2 = r \mu^2 + r^2 \mu^2,
\end{equation*}
which is the variance plus the squared mean for the Gamma distribution.
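Differentiating \(\left(1-\mu t\right)^{-r}\) confirms all three values; a sympy sketch:

```python
import sympy as sp

# mu > 0 is the mean waiting time per change, r > 0 the number of
# changes; M is the MGF from Theorem 8.5.5.
mu, r = sp.symbols('mu r', positive=True)
t = sp.symbols('t')
M = (1 - mu*t)**(-r)

assert M.subs(t, 0) == 1                                             # M(0) = 1
assert sp.simplify(M.diff(t).subs(t, 0) - r*mu) == 0                 # M'(0) = r mu
assert sp.simplify(M.diff(t, 2).subs(t, 0) - r*(r+1)*mu**2) == 0     # M''(0) = r(r+1) mu^2
```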
Once again, Sage can obtain the final answers quickly. For Poisson: