
Section 7.5 Generating Functions for Bernoulli-based Distributions

Moment generating functions (5.5.1) can be derived for each of the distributions in this chapter. Beginning with the Bernoulli distribution,
\[
M(t) = f(0) + e^t f(1) = (1-p) + pe^t.
\]
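As a quick sanity check, the Bernoulli MGF can be differentiated symbolically to confirm that \(M(0)=1\) and \(M'(0)=E[X]=p\). This is an illustrative sketch assuming the sympy library, not part of the text:

```python
# Illustrative check of the Bernoulli MGF (assumes sympy is installed).
import sympy as sp

t, p = sp.symbols('t p', positive=True)
M = (1 - p) + p * sp.exp(t)           # M(t) = (1-p) + p e^t

assert M.subs(t, 0) == 1              # M(0) = 1, as every MGF must satisfy
assert sp.diff(M, t).subs(t, 0) == p  # M'(0) = E[X] = p
```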
For the geometric distribution, presuming \(e^t(1-p) < 1\),
\[
M(t) = \sum_{x=1}^{\infty} e^{tx}\, p(1-p)^{x-1}
= \frac{p}{1-p}\sum_{x=1}^{\infty} \left(e^t(1-p)\right)^x
= \frac{p}{1-p}\cdot\frac{e^t(1-p)}{1-e^t(1-p)}
= \frac{pe^t}{1-e^t(1-p)},
\]
where we used the geometric series to evaluate the sum. Dividing the numerator and denominator through by \(e^t\) gives a second form, \(M(t) = \frac{p}{e^{-t}-(1-p)}\).
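The geometric-series step can be spot-checked numerically: a truncated sum of the series should agree with the closed form. The values below are illustrative, and the check requires \(e^t(1-p) < 1\):

```python
from math import exp

# Truncated geometric MGF series versus the closed form p e^t / (1 - e^t (1-p)).
p, t = 0.3, 0.2  # illustrative values with e^t * (1-p) < 1
series = sum(exp(t * x) * p * (1 - p)**(x - 1) for x in range(1, 2000))
closed = p * exp(t) / (1 - exp(t) * (1 - p))
assert abs(series - closed) < 1e-9
```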
\[
M(0) = \frac{pe^0}{1-e^0(1-p)} = \frac{p}{1-(1-p)} = 1.
\]
Using the second form for M(t),
\[
M'(t) = \frac{pe^{-t}}{\left(e^{-t}-(1-p)\right)^2}
\]
and therefore
\[
M'(0) = \frac{p}{(1-(1-p))^2} = \frac{1}{p}.
\]
Continuing with the second derivative,
\[
M''(t) = -\frac{pe^{-t}}{\left(p+e^{-t}-1\right)^2} + \frac{2pe^{-2t}}{\left(p+e^{-t}-1\right)^3}
\]
and therefore
\[
M''(0) = -\frac{p}{(p+1-1)^2} + \frac{2p}{(p+1-1)^3} = -\frac{1}{p} + \frac{2}{p^2} = \frac{1}{p^2} + \frac{1-p}{p^2},
\]
which is the squared mean plus the variance for the geometric distribution.
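These geometric moments can be confirmed symbolically. The following is a sketch assuming the sympy library, not the book's own code:

```python
import sympy as sp

t, p = sp.symbols('t p', positive=True)
M = p * sp.exp(t) / (1 - sp.exp(t) * (1 - p))      # geometric MGF

mean = sp.simplify(sp.diff(M, t).subs(t, 0))       # E[X]
second = sp.simplify(sp.diff(M, t, 2).subs(t, 0))  # E[X^2]

assert sp.simplify(mean - 1/p) == 0                       # mean is 1/p
assert sp.simplify(second - mean**2 - (1 - p)/p**2) == 0  # variance is (1-p)/p^2
```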
For the two uniform distributions and the basic Bernoulli, not much can be done as \(n\) (or \(a\) and \(b\)) varies. However, for all distributions that have the opportunity to "scale" larger and larger, we will first determine \(M(t)\) and then demonstrate a surprising relationship as their variables grow appropriately. Starting with the binomial distribution,
\[
M(t) = \sum_{x=0}^{n} e^{tx}\binom{n}{x}p^x(1-p)^{n-x}
= \sum_{x=0}^{n}\binom{n}{x}\left(pe^t\right)^x(1-p)^{n-x}
= \left(pe^t+(1-p)\right)^n
\]
where we used the binomial theorem to simplify the sum.
Notice that the moment generating function for the Bernoulli distribution is simply the binomial moment generating function with \(n=1\).
\[
M(0) = \left(pe^0+(1-p)\right)^n = 1^n = 1.
\]
Taking the derivative with respect to t,
\[
M'(t) = n\left(pe^t+(1-p)\right)^{n-1}pe^t
\]
and evaluating at t=0 gives
\[
M'(0) = n(p+(1-p))^{n-1}p = n\cdot 1^{n-1}\cdot p = np.
\]
Again, taking another derivative with respect to t,
\[
M''(t) = n(n-1)\left(pe^t+(1-p)\right)^{n-2}p^2e^{2t} + n\left(pe^t+(1-p)\right)^{n-1}pe^t
\]
and evaluating at t=0 gives
\[
M''(0) = n(n-1)(p+(1-p))^{n-2}p^2 + n(p+(1-p))^{n-1}p = n(n-1)p^2 + np = (np)^2 + np - np^2 = np(1-p) + (np)^2,
\]
which is the variance plus the squared mean for the binomial distribution.
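The binomial moments above can likewise be verified symbolically; again, this is an illustrative sympy sketch rather than the book's own code:

```python
import sympy as sp

t, p = sp.symbols('t p', positive=True)
n = sp.symbols('n', positive=True, integer=True)
M = (p * sp.exp(t) + (1 - p))**n      # binomial MGF

mean = sp.diff(M, t).subs(t, 0)       # should reduce to n*p
second = sp.diff(M, t, 2).subs(t, 0)  # should reduce to n(n-1)p^2 + n*p

assert sp.simplify(mean - n*p) == 0
assert sp.expand(second - (n*p*(1 - p) + (n*p)**2)) == 0
```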
For the negative binomial distribution, using the previous theorem 7.4.5 justifying the negative binomial probability function with \(a = pe^t\) and \(b = 1-p\), and changing variables to \(u = x-r\), gives
\[
M(t) = \sum_{u=0}^{\infty} e^{t(u+r)}\binom{u+r-1}{r-1}(1-p)^u p^r
= \left(pe^t\right)^r \sum_{u=0}^{\infty}\binom{u+r-1}{r-1}\left(e^t(1-p)\right)^u
= \frac{\left(pe^t\right)^r}{\left(1-e^t(1-p)\right)^r}\sum_{u=0}^{\infty}\binom{u+r-1}{r-1}\left(1-e^t(1-p)\right)^r\left(e^t(1-p)\right)^u
= \frac{\left(pe^t\right)^r}{\left(1-e^t(1-p)\right)^r},
\]
noting that, presuming \(e^t(1-p) < 1\), the last summation is the sum of a negative binomial probability function over its entire range and therefore equals 1. One may also rewrite the summation and appeal directly to the Negative Binomial Series 7.4.1 to prove this result.
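As with the geometric case, the closed form \(\left(pe^t\right)^r / \left(1-e^t(1-p)\right)^r\) can be spot-checked numerically against a truncated series, using illustrative values with \(e^t(1-p)<1\):

```python
from math import comb, exp

# Truncated negative binomial MGF series versus (p e^t)^r / (1 - e^t (1-p))^r.
p, r, t = 0.4, 3, 0.1  # illustrative values with e^t * (1-p) < 1
series = sum(exp(t * (u + r)) * comb(u + r - 1, r - 1) * (1 - p)**u * p**r
             for u in range(2000))
closed = (p * exp(t))**r / (1 - exp(t) * (1 - p))**r
assert abs(series - closed) < 1e-9
```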
Notice that the moment generating function for the geometric distribution is simply the negative binomial moment generating function with \(r=1\).
\[
M(0) = \frac{(pe^0)^r}{(1-(1-p))^r} = \frac{p^r}{p^r} = 1.
\]
Taking the derivative with respect to t,
\[
M'(t) = \frac{rp^re^{rt}}{\left(1-(1-p)e^t\right)^{r+1}}
\]
and evaluating at \(t=0\) gives
\[
M'(0) = \frac{rp^r}{(1-(1-p))^{r+1}} = \frac{rp^r}{p^{r+1}} = \frac{r}{p}.
\]
Again, taking another derivative with respect to \(t\),
\[
M''(t) = \frac{rp^re^{rt}\left(r+(1-p)e^t\right)}{\left(1-(1-p)e^t\right)^{r+2}}
\]
and evaluating at \(t=0\) gives
\[
M''(0) = \frac{rp^r(r+(1-p))}{p^{r+2}} = \frac{r(r+1-p)}{p^2} = \frac{r(1-p)}{p^2} + \left(\frac{r}{p}\right)^2,
\]
which is the variance plus the squared mean for the negative binomial distribution.
Just in case you are wondering about the derivations above, and especially the derivatives, Sage will do the heavy lifting for you.
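For example, in plain Python the sympy library (whose syntax is close to Sage's) can carry out the negative binomial differentiation symbolically. This is an illustrative sketch, not the book's own worksheet:

```python
import sympy as sp

t, p = sp.symbols('t p', positive=True)
r = sp.symbols('r', positive=True, integer=True)

# Negative binomial MGF and its first two derivatives.
M = (p * sp.exp(t))**r / (1 - sp.exp(t) * (1 - p))**r
Mp = sp.diff(M, t)
Mpp = sp.diff(M, t, 2)

# Evaluate at t = 0 with illustrative values p = 2/5, r = 3, where the
# mean should be r/p = 15/2 and the variance r(1-p)/p^2 = 45/4.
vals = {p: sp.Rational(2, 5), r: 3}
mean = sp.simplify(Mp.subs(t, 0).subs(vals))
var = sp.simplify(Mpp.subs(t, 0).subs(vals) - mean**2)

assert mean == sp.Rational(15, 2)
assert var == sp.Rational(45, 4)
```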