
Section 7.5 Generating Functions for Bernoulli-based Distributions

Moment generating functions (5.5.1) can be derived for each of the distributions in this chapter. For the Bernoulli distribution,

\begin{equation*} M(t) = f(0) + e^t f(1) = (1-p) + p e^t \end{equation*}
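As a quick numerical sanity check (an illustration, not part of the text, with \(p = 0.3\) chosen arbitrarily), the Bernoulli MGF satisfies \(M(0) = 1\) and \(M'(0) = p\), where the derivative is approximated below by a central difference:

```python
import math

def M(t, p):
    # Bernoulli MGF: M(t) = (1 - p) + p * e^t
    return (1 - p) + p * math.exp(t)

def dM(t, p, h=1e-6):
    # central-difference approximation of M'(t)
    return (M(t + h, p) - M(t - h, p)) / (2 * h)

p = 0.3
print(round(M(0.0, p), 10))   # 1.0, since M(0) must equal 1
print(round(dM(0.0, p), 6))   # 0.3, since M'(0) = p is the mean
```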

For the geometric distribution, presuming \(e^t (1-p) \lt 1\text{,}\)

\begin{align*} M(t) & = \sum_{x=1}^{\infty} e^{tx} p (1-p)^{x-1}\\ & = \frac{p}{1-p} \sum_{x=1}^{\infty} (e^t (1-p))^x\\ & = \frac{p}{1-p} \frac{e^t (1-p)}{1 - e^t (1-p)}\\ & = \frac{pe^t }{1 - e^t (1-p)}. \end{align*}

where we used the geometric series to evaluate the sum. Dividing the numerator and denominator through by \(e^t\) gives the second form \(M(t) = \frac{p}{e^{-t} - (1-p)}\text{.}\)
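The geometric series step can be checked numerically. The sketch below (parameter values are arbitrary choices for illustration) compares a truncated version of the defining sum with the closed form \(pe^t/(1 - e^t(1-p))\):

```python
import math

def geom_mgf_closed(t, p):
    # closed form, valid when e^t * (1 - p) < 1
    q = math.exp(t) * (1 - p)
    assert q < 1, "MGF only converges for e^t (1-p) < 1"
    return p * math.exp(t) / (1 - q)

def geom_mgf_series(t, p, terms=2000):
    # truncated defining sum: sum over x >= 1 of e^(tx) p (1-p)^(x-1),
    # grouped as p e^t (e^t (1-p))^(x-1) to keep intermediates small
    return sum(p * math.exp(t) * (math.exp(t) * (1 - p)) ** (x - 1)
               for x in range(1, terms + 1))

p, t = 0.4, 0.2
print(abs(geom_mgf_closed(t, p) - geom_mgf_series(t, p)) < 1e-9)  # True
```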

As a check,

\begin{equation*} M(0) = p \frac{e^0 }{1 - e^0 (1-p)} = \frac{p}{1-(1-p)} = 1. \end{equation*}

Using the second form for \(M(t)\text{,}\)

\begin{equation*} M'(t) = \frac{e^{-t} p}{(e^{-t} - (1-p))^2} \end{equation*}

and therefore

\begin{equation*} M'(0) = \frac{p}{(1-(1-p))^2} = \frac{1}{p}. \end{equation*}

Continuing with the second derivative,

\begin{equation*} M''(t) = -\frac{p e^{-t} }{{\left(p + e^{-t} - 1\right)}^2} + \frac{2 p e^{-2t}}{{\left(p + e^{-t} - 1 \right)}^{3}} \end{equation*}

and therefore

\begin{equation*} M''(0) = -\frac{p}{{\left(p + 1 - 1 \right)}^2} + \frac{2 p}{{\left(p + 1 - 1 \right)}^{3}} = -\frac{1}{p} + \frac{2}{p^2} = \frac{1}{p^2} + \frac{1-p}{p^2} \end{equation*}

which is the squared mean plus the variance for the geometric distribution.
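These moment computations can be spot-checked numerically. The sketch below (an illustration with \(p = 0.25\) chosen arbitrarily) differentiates the closed-form \(M(t)\) by central differences and compares against \(1/p\) and \(2/p^2 - 1/p\):

```python
import math

p = 0.25

def M(t):
    # geometric MGF: p e^t / (1 - e^t (1 - p)), valid for e^t (1-p) < 1
    return p * math.exp(t) / (1 - math.exp(t) * (1 - p))

def d1(f, t, h=1e-5):
    # central difference for the first derivative
    return (f(t + h) - f(t - h)) / (2 * h)

def d2(f, t, h=1e-4):
    # central difference for the second derivative
    return (f(t + h) - 2 * f(t) + f(t - h)) / h ** 2

print(round(d1(M, 0.0), 4))  # 4.0  = 1/p, the mean
print(round(d2(M, 0.0), 2))  # 28.0 = 2/p^2 - 1/p, the second moment
```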

For the two uniform distributions and the basic Bernoulli distribution, there is not much to be done as n (or a and b) varies. However, for the distributions that have the opportunity to "scale" larger and larger, we will first determine M(t) and then demonstrate a surprising relationship as their parameters grow appropriately. For the binomial distribution,

\begin{align*} M(t) & = \sum_{x=0}^n e^{tx} \binom{n}{x} p^x (1-p)^{n-x} \\ & = \sum_{x=0}^n \binom{n}{x} (pe^t)^x (1-p)^{n-x} \\ & = \left ( p e^t + (1-p) \right )^n \end{align*}

where we used the binomial theorem to simplify the sum.
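The binomial theorem step can be verified numerically; this sketch (with \(n\text{,}\) \(p\text{,}\) and \(t\) chosen arbitrarily for illustration) compares the closed form \((pe^t + (1-p))^n\) with the defining sum over the full support:

```python
import math

def binom_mgf_closed(t, n, p):
    # closed form via the binomial theorem
    return (p * math.exp(t) + (1 - p)) ** n

def binom_mgf_sum(t, n, p):
    # defining sum: sum over x = 0..n of e^(tx) C(n,x) p^x (1-p)^(n-x)
    return sum(math.exp(t * x) * math.comb(n, x)
               * p ** x * (1 - p) ** (n - x) for x in range(n + 1))

n, p, t = 10, 0.3, 0.5
print(abs(binom_mgf_closed(t, n, p) - binom_mgf_sum(t, n, p)) < 1e-9)  # True
```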

Notice that the moment generating function for Bernoulli is simply the Binomial moment generating function with n=1.

\begin{equation*} M(0) = \left ( p e^0 + (1-p) \right )^n = 1^n = 1. \end{equation*}

Taking the derivative with respect to t,

\begin{equation*} M'(t) = n \left ( p e^t + (1-p) \right )^{n-1} p e^t \end{equation*}

and evaluating at t=0 gives

\begin{equation*} M'(0) = n \left ( p + (1-p) \right )^{n-1} p = n 1^{n-1} p = np. \end{equation*}

Again, taking another derivative with respect to t,

\begin{equation*} M''(t) = n(n-1) \left ( p e^t + (1-p) \right )^{n-2} p^2 e^{2t} + n \left ( p e^t + (1-p) \right )^{n-1} p e^t \end{equation*}

and evaluating at t=0 gives

\begin{align*} M''(0) & = n(n-1) ( p + (1-p))^{n-2} p^2 + n ( p + (1-p) )^{n-1} p \\ & = n(n-1)p^2 + np = (np)^2 + np - np^2 = np(1-p) + (np)^2. \end{align*}
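As with the geometric case, the binomial moments can be spot-checked with central differences (a sketch; the values \(n = 10\) and \(p = 0.3\) are arbitrary choices for illustration):

```python
import math

n, p = 10, 0.3

def M(t):
    # binomial MGF: (p e^t + (1 - p))^n
    return (p * math.exp(t) + (1 - p)) ** n

def d1(f, t, h=1e-5):
    # central difference for the first derivative
    return (f(t + h) - f(t - h)) / (2 * h)

def d2(f, t, h=1e-4):
    # central difference for the second derivative
    return (f(t + h) - 2 * f(t) + f(t - h)) / h ** 2

print(round(d1(M, 0.0), 4))  # 3.0  = np, the mean
print(round(d2(M, 0.0), 2))  # 11.1 = np(1-p) + (np)^2, the second moment
```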

Using the previous theorem 7.4.5 justifying the Negative Binomial probability function with \(a = p e^t\) and \(b = 1-p\text{,}\) and changing variables to \(u = x-r\text{,}\) gives

\begin{align*} M(t) & = \sum_{u=0}^{\infty} e^{t(u+r)} \binom{u+r - 1}{r-1}(1-p)^{u}p^r\\ & = (pe^t)^r \sum_{u=0}^{\infty} \binom{u + r - 1}{r-1}(e^t(1-p))^u \\ & = \frac{(pe^t)^{r}}{(1 - e^t(1-p))^r} \sum_{u=0}^{\infty} \binom{u + r - 1}{r-1}(1 - e^t(1-p))^r (e^t(1-p))^u \\ & = \frac{(pe^t)^{r}}{(1 - e^t(1-p))^r} \end{align*}

noting that the last summation is the sum of a negative binomial probability function over its entire range, and therefore equals one.

One may also rewrite the summation and appeal directly to the Negative Binomial Series 7.4.1 to prove this result.

Notice that the moment generating function for Geometric is simply the Negative Binomial moment generating function with r=1.

\begin{equation*} M(0) = \frac{(pe^0)^r}{(1 - (1-p))^r} = \frac{p^r}{p^r} = 1. \end{equation*}

Taking the derivative with respect to t,

\begin{equation*} M'(t) = \frac{r \left(p e^{t}\right)^{r}}{\left(1 - e^{t}(1-p)\right)^{r + 1}} \end{equation*}

and evaluating at t=0 gives

\begin{equation*} M'(0) = \frac{r p^{r}}{p^{r+1}} = \frac{r}{p}. \end{equation*}

Again, taking another derivative with respect to t,

\begin{equation*} M''(t) = \frac{r^{2} \left(p e^{t}\right)^{r}}{\left(1 - e^{t}(1-p)\right)^{r+1}} + \frac{r(r+1)(1-p)\, e^{t} \left(p e^{t}\right)^{r}}{\left(1 - e^{t}(1-p)\right)^{r+2}} \end{equation*}

and evaluating at t=0 gives

\begin{align*} M''(0) & = \frac{r^{2} p^{r}}{p^{r+1}} + \frac{r(r+1)(1-p) p^{r}}{p^{r+2}} = \frac{r^{2}}{p} + \frac{r(r+1)(1-p)}{p^{2}} \\ & = \frac{r(1-p)}{p^2} + \left ( \frac{r}{p} \right )^2 \end{align*}

which is the variance plus the squared mean for the negative binomial distribution.
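This result can also be checked numerically. The sketch below (with \(r = 3\text{,}\) \(p = 0.4\text{,}\) and \(t = 0.2\) chosen arbitrarily for illustration) compares the closed form \(M(t) = (pe^t)^r/(1 - e^t(1-p))^r\) for the number of trials against a truncated version of the defining sum, then verifies the first two moments by central differences:

```python
import math

r, p = 3, 0.4

def M(t):
    # negative binomial (number of trials) MGF:
    # (p e^t)^r / (1 - e^t (1 - p))^r, valid for e^t (1 - p) < 1
    q = math.exp(t) * (1 - p)
    return (p * math.exp(t)) ** r / (1 - q) ** r

def M_series(t, terms=3000):
    # truncated defining sum with u = x - r, grouped as
    # (p e^t)^r * (e^t (1-p))^u * C(u+r-1, r-1) to keep intermediates small
    return sum((p * math.exp(t)) ** r * (math.exp(t) * (1 - p)) ** u
               * math.comb(u + r - 1, r - 1) for u in range(terms))

def d1(f, t, h=1e-5):
    # central difference for the first derivative
    return (f(t + h) - f(t - h)) / (2 * h)

def d2(f, t, h=1e-4):
    # central difference for the second derivative
    return (f(t + h) - 2 * f(t) + f(t - h)) / h ** 2

print(abs(M(0.2) - M_series(0.2)) < 1e-9)  # True
print(round(d1(M, 0.0), 4))  # 7.5  = r/p, the mean
print(round(d2(M, 0.0), 2))  # 67.5 = r(1-p)/p^2 + (r/p)^2
```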

In case you are wondering about the derivations above, and especially about taking the derivatives, Sage will do the heavy lifting for you.