Section 6.5 Generating Functions for Uniform-based Distributions
Theorem 6.5.1. Moment Generating Function for Discrete Uniform.
For a discrete uniform random variable \(X\) on the space \(R = \{1, 2, \ldots, n\}\text{,}\)
\begin{equation*}
M(t) = \frac{1}{n} \cdot \left [ e^t + e^{2t} + \cdots + e^{nt} \right ]
\end{equation*}
Proof.
Presuming \(R = \{1, 2, \ldots, n\}\) with each outcome having probability \(1/n\text{,}\)
\begin{equation*}
M(t) = \sum_{x=1}^n \frac{e^{tx}}{n} = \frac{1}{n} \cdot \left [ e^t + e^{2t} + \cdots + e^{nt} \right ]
\end{equation*}
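As an added illustration (not part of the original argument), differentiating this generating function and evaluating at \(t = 0\) recovers the familiar mean of the discrete uniform distribution:
\begin{equation*}
M'(t) = \frac{1}{n} \cdot \left [ e^t + 2e^{2t} + \cdots + n e^{nt} \right ], \qquad M'(0) = \frac{1 + 2 + \cdots + n}{n} = \frac{n+1}{2}\text{.}
\end{equation*}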
Theorem 6.5.2. Moment Generating Function for Continuous Uniform.
For a continuous uniform random variable \(X\) on the space \(R = [a,b]\text{,}\)
\begin{equation*}
M(t) = \frac{e^{bt} - e^{at}}{t(b-a)} \quad \text{for } t \neq 0\text{, with } M(0) = 1
\end{equation*}
Proof.
Presuming \(R = [a,b]\text{,}\) for \(t \neq 0\)
\begin{equation*}
M(t) = \int_a^b e^{tx} \frac{1}{b-a} \, dx = \frac{1}{b-a} \cdot \frac{1}{t} e^{tx} \Big |_a^b = \frac{e^{bt} - e^{at}}{t(b-a)}\text{.}
\end{equation*}
At \(t = 0\text{,}\) \(M(0) = \int_a^b \frac{1}{b-a} \, dx = 1\text{.}\)
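As an added note, expanding the exponentials in Taylor series shows that the apparent singularity at \(t = 0\) is removable and reads off the first two moments directly:
\begin{equation*}
M(t) = \frac{(b-a)t + \frac{1}{2}(b^2-a^2)t^2 + \frac{1}{6}(b^3-a^3)t^3 + \cdots}{t(b-a)} = 1 + \frac{a+b}{2} t + \frac{a^2+ab+b^2}{6} t^2 + \cdots
\end{equation*}
so \(M'(0) = \frac{a+b}{2}\) and \(M''(0) = \frac{a^2+ab+b^2}{3}\text{,}\) the mean and second moment of the continuous uniform distribution.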
Theorem 6.5.3. Moment Generating Function for Hypergeometric.
For a hypergeometric random variable \(X\) over the space \(R = \{0, 1, \ldots, \min(r, n_1)\}\text{,}\)
\begin{equation*}
M(t) = \sum_{x \in R} e^{tx} \, \frac{\binom{n_1}{x} \binom{n-n_1}{r-x}}{\binom{n}{r}}\text{.}
\end{equation*}
There is no single easy formula that captures this summation nicely, but for particular values of the parameters it may simplify; a small example is given after the proof.
Proof.
\begin{align*}
M(t) & = \sum_{x=0}^{\min(r, n_1)} e^{tx} \frac{\binom{n_1}{x} \binom{n-n_1}{r-x}}{\binom{n}{r}}\\
& = \text{a mess...}
\end{align*}
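To illustrate the remark above, consider particular parameter values chosen only for concreteness, say \(n = 4\text{,}\) \(n_1 = 2\text{,}\) and \(r = 2\text{,}\) so that \(R = \{0, 1, 2\}\text{:}\)
\begin{equation*}
M(t) = \frac{\binom{2}{0}\binom{2}{2}}{\binom{4}{2}} + \frac{\binom{2}{1}\binom{2}{1}}{\binom{4}{2}} e^{t} + \frac{\binom{2}{2}\binom{2}{0}}{\binom{4}{2}} e^{2t} = \frac{1 + 4e^t + e^{2t}}{6}\text{.}
\end{equation*}
Here \(M'(0) = \frac{4 + 2}{6} = 1\text{,}\) which matches the hypergeometric mean \(r \cdot n_1 / n = 2 \cdot 2 / 4 = 1\text{.}\)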