
Section 5.5 Generating Functions

In the previous section, you were able to take a given probability distribution and compute several "special" expected values, such as the mean and variance, that provided useful or interesting facts about the particular distribution. It is possible, however, to embed many of these metrics in a single "generating function".

Definition 5.5.1. Moment Generating Function.

Given a probability function \(f(x)\text{,}\) the moment generating function is a transformation given by

\begin{equation*} M(t) = E[e^{tx}] \end{equation*}

where the expected value is a summation or integral depending on the nature of the random variable \(x\text{.}\) If the expected value does not exist (due perhaps to an \(f(x)\) with asymptotes), then \(M(t)\) does not exist.
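For a concrete example, consider the exponential distribution with rate \(\lambda \gt 0\text{,}\) so that \(f(x) = \lambda e^{-\lambda x}\) for \(x \ge 0\text{:}\)

\begin{equation*} M(t) = \int_0^{\infty} e^{tx} \lambda e^{-\lambda x} \, dx = \frac{\lambda}{\lambda - t}, \end{equation*}

where the integral converges only when \(t \lt \lambda\text{.}\) Happily, \(M(t)\) only needs to exist on an open interval around t=0 to be useful.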

Definition 5.5.2. Probability Generating Function.

Given a probability function \(f(x)\text{,}\) the probability generating function is a transformation given by

\begin{equation*} N(t) = E[t^x] \end{equation*}

where the expected value is a summation or integral depending on the nature of the random variable \(x\text{,}\) again presuming that the resulting expected value exists.
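For instance, if \(x\) is a Bernoulli variable, equal to 1 with probability \(p\) and 0 with probability \(1-p\text{,}\) then

\begin{equation*} N(t) = E[t^x] = (1-p) t^0 + p t^1 = 1 - p + pt. \end{equation*}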

In both cases, evaluating the generating function and its derivatives at t=0 recovers familiar metrics: \(M(0) = 1\text{,}\) \(M'(0) = \mu\text{,}\) \(M''(0) = E[x^2]\text{,}\) and the derivatives of \(N(t)\) recover the individual probabilities \(f(0), f(1), f(2), \ldots\)

Obvious? :)

For the first result, notice

\begin{equation*} M(0) = E[e^{0}] = E[1] = 1 \end{equation*}

is pretty trivial, since \(E[1]\) is just the total probability, namely 1.

For the next results, take derivatives, passing the derivative inside the summation or integral, which is allowed whenever \(M(t)\) exists on an open interval around t=0. Here, let's consider the case where \(x\) is a continuous variable and leave the discrete case for you.

\begin{equation*} M'(t) = D_t \left [ \int_R e^{tx} f(x) dx \right ] = \int_R x e^{tx} f(x) dx \end{equation*}

and then evaluating at t=0 gives

\begin{equation*} M'(0) = \int_R x e^{0} f(x) dx = \mu . \end{equation*}
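Continuing the exponential example from above, \(M(t) = \lambda/(\lambda - t)\) gives

\begin{equation*} M'(t) = \frac{\lambda}{(\lambda - t)^2} \end{equation*}

and so \(M'(0) = \lambda/\lambda^2 = 1/\lambda\text{,}\) the familiar mean of the exponential distribution.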

Continuing,

\begin{equation*} M''(t) = D_t \left [ \int_R x e^{tx} f(x) dx \right ] = \int_R x^2 e^{tx} f(x) dx \end{equation*}

and evaluating at t=0 gives

\begin{equation*} M''(0) = \int_R x^2 e^{0} f(x) dx = E[x^2] = \sigma^2 + \mu^2. \end{equation*}
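Once more with the exponential example,

\begin{equation*} M''(t) = \frac{2\lambda}{(\lambda - t)^3} \end{equation*}

so \(M''(0) = 2/\lambda^2\) and therefore \(\sigma^2 = E[x^2] - \mu^2 = 2/\lambda^2 - 1/\lambda^2 = 1/\lambda^2\text{.}\)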

It should be noted that the pattern continues: \(M^{(k)}(0) = E[x^k]\) for each \(k\text{,}\) so one may also determine the skewness and the kurtosis in a similar manner.
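For instance, the third central moment (which, divided by \(\sigma^3\text{,}\) gives the skewness) expands into moments available from \(M(t)\text{:}\)

\begin{equation*} E[(x - \mu)^3] = E[x^3] - 3 \mu E[x^2] + 2 \mu^3 = M'''(0) - 3 \mu M''(0) + 2 \mu^3. \end{equation*}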

Notice, in the discrete case where \(R = \{0, 1, 2, \ldots\}\text{,}\)

\begin{equation*} N(t) = E[t^x] = \sum_R t^x f(x) = f(0) + t f(1) + t^2 f(2) + t^3 f(3) + \cdots \end{equation*}

Therefore, by continually taking derivatives with respect to t and then evaluating those derivatives at t=0,

\begin{gather*} N(0) = f(0)\\ N'(0) = f(1)\\ N''(0) = 2 f(2)\\ N'''(0) = 6 f(3) \end{gather*}

etc. In general, \(N^{(k)}(0) = k! \, f(k)\text{,}\) so dividing the \(k\)th derivative by \(k!\) recovers the probability \(f(k)\text{.}\)
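As a quick check, consider the Poisson distribution with \(f(k) = \lambda^k e^{-\lambda}/k!\text{.}\) Its probability generating function is

\begin{equation*} N(t) = \sum_{k=0}^{\infty} t^k \frac{\lambda^k e^{-\lambda}}{k!} = e^{-\lambda} e^{\lambda t} = e^{\lambda (t-1)}, \end{equation*}

and \(N^{(k)}(t) = \lambda^k e^{\lambda(t-1)}\) gives \(N^{(k)}(0) = \lambda^k e^{-\lambda} = k! \, f(k)\) as promised.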

This yields a nice uniqueness result: if two discrete random variables have the same probability generating function, they have the same distribution. Apply the previous result to each variable and notice that both variables must have the same probability function \(f(x)\text{.}\)
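In symbols, writing \(N_X\) and \(N_Y\) for the generating functions of the two variables, if \(N_X(t) = N_Y(t)\) for all \(t\) near 0, then for every \(k\)

\begin{equation*} f_X(k) = \frac{N_X^{(k)}(0)}{k!} = \frac{N_Y^{(k)}(0)}{k!} = f_Y(k). \end{equation*}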