
Section 8.5 Generating Functions for Poisson Process Distributions

Moment Generating Functions (5.5.1) can be derived for each of the distributions in this chapter.
Using the Poisson probability function,
\begin{align*} M(t) & = \sum_{x=0}^{\infty} e^{tx} \frac{\mu^x e^{-\mu}}{x!}\\ & = \sum_{x=0}^{\infty} \frac{\left (\mu e^t \right )^x e^{-\mu e^t} e^{\mu e^t} e^{-\mu} }{x!}\\ & = e^{\mu e^t} e^{-\mu} \sum_{x=0}^{\infty} \frac{\left (\mu e^t \right )^x e^{-\mu e^t} }{x!}\\ & = e^{\mu \left( e^t - 1 \right )} \sum_{x=0}^{\infty} \frac{\left (\mu e^t \right )^x e^{-\mu e^t} }{x!}\\ & = e^{\mu \left( e^t - 1 \right )}, \end{align*}
where the remaining sum is over the probability function of a Poisson distribution with mean \(\mu e^t\) and therefore equals 1.
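That is,
\begin{equation*} \sum_{x=0}^{\infty} \frac{\left (\mu e^t \right )^x e^{-\mu e^t} }{x!} = 1. \end{equation*}
As a quick check, \(M(0)\) should equal 1: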
\begin{equation*} M(0) = e^{\mu \left ( e^0 - 1 \right )} = e^0 = 1. \end{equation*}
Continuing,
\begin{equation*} M'(t) = \mu e^{\left(\mu {\left(e^{t} - 1\right)} + t\right)} \end{equation*}
and therefore
\begin{equation*} M'(0) = \mu e^{\left(\mu {\left(1 - 1\right)} + 0\right)} = \mu e^0 = \mu. \end{equation*}
Continuing with the second derivative,
\begin{equation*} M''(t) = {\left(\mu e^{t} + 1\right)} \mu e^{\left(\mu {\left(e^{t} - 1\right)} + t\right)} \end{equation*}
and therefore
\begin{equation*} M''(0) = {\left(\mu + 1\right)} \mu e^{\left(\mu {\left(1 - 1\right)} + 0\right)} = {\left(\mu + 1\right)} \mu e^0 = \mu + \mu^2 \end{equation*}
which is the squared mean plus the variance for the Poisson distribution.
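Since \(M''(0)\) gives the second moment, the variance can be recovered as
\begin{equation*} M''(0) - \left [ M'(0) \right ]^2 = \mu + \mu^2 - \mu^2 = \mu, \end{equation*}
which is indeed the variance of the Poisson distribution.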
For the exponential distribution with mean \(\mu\), and for \(t < \frac{1}{\mu}\) so that the integral converges,
\begin{align*} M(t) & = \int_0^{\infty} e^{tx} \frac{1}{\mu} e^{-\frac{x}{\mu}} dx\\ & = \frac{1}{\mu} \int_0^{\infty} e^{- x \left ( \frac{1}{\mu} - t \right ) } dx\\ & = \frac{-1}{\mu \left( \frac{1}{\mu} - t \right )} e^{- x \left ( \frac{1}{\mu} - t \right ) } \big |_0^{\infty}\\ & = \frac{-1}{\mu \left( \frac{1}{\mu} - t \right )} \left ( 0 - 1 \right )\\ & = \frac{1}{1 - \mu t}. \end{align*}
\begin{equation*} M(0) = \frac{1}{1-\mu \cdot 0} = 1. \end{equation*}
Continuing,
\begin{equation*} M'(t) = \frac{\mu}{ \left ( 1-\mu t \right )^2} \end{equation*}
and therefore
\begin{equation*} M'(0) = \frac{\mu}{ \left ( 1-\mu \cdot 0 \right )^2} = \mu. \end{equation*}
Continuing with the second derivative,
\begin{equation*} M''(t) = \frac{2 \mu^2}{ \left ( 1-\mu t \right )^3} \end{equation*}
and therefore
\begin{equation*} M''(0) = \frac{2 \mu^2}{ \left ( 1-\mu \cdot 0 \right )^3} = 2 \mu^2 = \mu^2 + \mu^2 \end{equation*}
which is the squared mean plus the variance for the exponential distribution.
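Here the variance is \(M''(0) - \left [ M'(0) \right ]^2 = 2\mu^2 - \mu^2 = \mu^2\), the known variance of the exponential distribution.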
For the gamma distribution with parameters \(r\) and \(\mu\), again with \(t < \frac{1}{\mu}\),
\begin{align*} M(t) & = \int_0^{\infty} e^{tx} \frac{x^{r-1} \cdot e^{-\frac{x}{\mu}}}{\Gamma(r) \cdot \mu^r} dx\\ & = \int_0^{\infty} \frac{x^{r-1} \cdot e^{-x \left ( \frac{1}{\mu} - t \right )}}{\Gamma(r) \cdot \mu^r} dx\\ & = \frac{1}{\left ( 1-t \mu \right )^r} \int_0^{\infty} \frac{x^{r-1} \cdot e^{-\frac{x}{ \left ( \frac{\mu}{1-t \mu} \right )}}}{\Gamma(r) \cdot { \left ( \frac{\mu}{1-t \mu} \right )}^r} dx\\ & = \frac{1}{\left( 1 - \mu t \right )^{r}}, \end{align*}
since the last integral is of the gamma probability function with \(\mu\) replaced by the adjusted value \(\frac{\mu}{1-t\mu}\) and therefore equals 1.
\begin{equation*} M(0) = \frac{1}{ \left ( 1-\mu \cdot 0 \right )^{r}} = 1. \end{equation*}
Continuing,
\begin{equation*} M'(t) = \frac{r \mu}{ \left ( 1-\mu t \right )^{r+1}} \end{equation*}
and therefore
\begin{equation*} M'(0) = \frac{r \mu}{ \left ( 1-\mu \cdot 0 \right )^{r+1}} = r \mu. \end{equation*}
Continuing with the second derivative,
\begin{equation*} M''(t) = \frac{r(r+1) \mu^2}{ \left ( 1-\mu t \right )^{r+2}} \end{equation*}
and therefore
\begin{equation*} M''(0) = \frac{r(r+1) \mu^2}{ \left ( 1-\mu \cdot 0 \right )^{r+2}} = r(r+1) \mu^2 = r \mu^2 + r^2 \mu^2 \end{equation*}
which is the squared mean plus the variance for the gamma distribution.
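As before, the variance follows as \(M''(0) - \left [ M'(0) \right ]^2 = r(r+1)\mu^2 - (r\mu)^2 = r\mu^2\), the variance of the gamma distribution.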
Once again, Sage can obtain the final answers quickly. For Poisson:
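A minimal sketch of such a computation (the symbol names t and mu are illustrative, and this assumes an interactive Sage session where bare expressions are displayed) is:

t, mu = var('t, mu')
M = exp(mu*(exp(t) - 1))        # Poisson mgf derived above
M1 = diff(M, t)                 # M'(t)
M2 = diff(M, t, 2)              # M''(t)
M1.subs(t=0).simplify_full()    # displays mu, the mean
M2.subs(t=0).simplify_full()    # displays mu^2 + mu, the second moment

Replacing M with 1/(1 - mu*t) for the exponential distribution, or with 1/(1 - mu*t)^r (after declaring r with var('r')) for the gamma distribution, reproduces the other results above in the same way.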