
Section 7.2 Binomial Distribution

Consider a sequence of \(n\) independent Bernoulli trials in which the probability of success \(p\) on each individual trial stays constant from trial to trial, with \(0 \lt p \lt 1\text{.}\) If we let the variable \(X\) measure the number of successes obtained in the fixed number of trials \(n\text{,}\) with \(R = \{ 0, 1, \ldots, n \}\text{,}\) then the resulting distribution of probabilities is called a Binomial Distribution.

You can, of course, compute specific values and graph the Binomial Distribution using R as well; a short sketch follows.
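As a minimal sketch, with illustrative values \(n = 20\) and \(p = 0.3\) (not tied to any particular example), base R's dbinom() gives individual probabilities and barplot() graphs the probability function.

# Illustrative values; any n and 0 < p < 1 work the same way
n <- 20
p <- 0.3
x <- 0:n

dbinom(5, n, p)    # a specific value, P(X = 5)

# Graph the probability mass function
barplot(dbinom(x, n, p), names.arg = x,
        xlab = "x", ylab = "f(x)",
        main = "Binomial(n = 20, p = 0.3)")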

Since successive trials are independent, any particular ordering of \(x\) successes and \(n-x\) failures has probability \(p^x(1-p)^{n-x}\text{,}\) and there are \(\binom{n}{x}\) such orderings. Hence the probability of exactly \(x\) successes in \(n\) trials is given by

\begin{equation*} P(X=x) = \binom{n}{x}P(SS...SFF...F) = \binom{n}{x}p^x(1-p)^{n-x} \end{equation*}

Using the Binomial Theorem with \(a = p\) and \(b = 1-p\) confirms that these probabilities sum to one:

\begin{equation*} \sum_{x=0}^n \binom{n}{x}p^x(1-p)^{n-x} = (p + (1-p))^n = 1 \end{equation*}
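This identity can also be checked numerically in R for any particular choice of \(n\) and \(p\); the values below are only illustrative.

# Check that the binomial probabilities sum to 1
n <- 20
p <- 0.3
sum(choose(n, 0:n) * p^(0:n) * (1 - p)^(n - 0:n))   # directly from the formula
sum(dbinom(0:n, n, p))                              # the same check with dbinom()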

Use the interactive cell below to compute \(f(x)\) and \(F(x)\) for the Binomial distribution.
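Outside the interactive cell, the same computation is a one-liner in R: dbinom() gives \(f(x)\) and pbinom() gives \(F(x)\text{.}\) The values of \(n\) and \(p\) below are only placeholders.

# f(x) = P(X = x) and F(x) = P(X <= x) for placeholder values of n and p
n <- 10
p <- 0.4
x <- 0:n
data.frame(x = x,
           f = dbinom(x, n, p),    # probability function f(x)
           F = pbinom(x, n, p))    # cumulative distribution F(x)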

For the mean,

\begin{align*} \mu & = E[X] \\ & = \sum_{x=0}^{n} {x \binom{n}{x} p^x (1-p)^{n-x}}\\ & = \sum_{x=1}^{n} {x \frac{n(n-1)!}{x(x-1)!(n-x)!} p^x (1-p)^{n-x}}\\ & = np \sum_{x=1}^{n} {\frac{(n-1)!}{(x-1)!((n-1)-(x-1))!} p^{x-1} (1-p)^{(n-1)-(x-1)}} \end{align*}

Using the change of variables \(k=x-1\) and \(m = n-1\) yields a binomial series

\begin{align*} & = np \sum_{k=0}^{m} {\frac{m!}{k!(m-k)!} p^k (1-p)^{m-k}}\\ & = np (p + (1-p))^m = np \end{align*}
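The algebra above can be sanity-checked numerically: summing \(x f(x)\) over the range in R should reproduce \(np\) for any choice of \(n\) and \(p\) (the values below are arbitrary).

# Numerical check that E[X] = n*p
n <- 12
p <- 0.35
x <- 0:n
sum(x * dbinom(x, n, p))   # mean from the definition of expected value
n * p                      # closed form np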

For the variance,

\begin{align*} \sigma^2 & = E[X(X-1)] + \mu - \mu^2 \\ & = \sum_{x=0}^{n} {x(x-1) \binom{n}{x} p^x (1-p)^{n-x}} + np - n^2p^2\\ & = \sum_{x=2}^{n} {x(x-1) \frac{n(n-1)(n-2)!}{x(x-1)(x-2)!(n-x)!} p^x (1-p)^{n-x}} + np - n^2p^2\\ & = n(n-1)p^2 \sum_{x=2}^{n} {\frac{(n-2)!}{(x-2)!((n-2)-(x-2))!} p^{x-2} (1-p)^{(n-2)-(x-2)}} + np - n^2p^2 \end{align*}

Using the change of variables \(k=x-2\) and \(m = n-2\) yields a binomial series

\begin{align*} & = n(n-1)p^2 \sum_{k=0}^{m} {\frac{m!}{k!(m-k)!} p^k (1-p)^{m-k}} + np - n^2p^2\\ & = n(n-1)p^2 + np - n^2p^2 = np - np^2 = np(1-p) \end{align*}
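A matching numerical check for the variance, again with arbitrary illustrative values:

# Numerical check that Var(X) = n*p*(1-p)
n <- 12
p <- 0.35
x <- 0:n
mu <- n * p
sum((x - mu)^2 * dbinom(x, n, p))   # variance from the definition
n * p * (1 - p)                     # closed form np(1-p)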

The skewness and kurtosis can be found similarly using formulas involving \(E[X(X-1)(X-2)]\) and \(E[X(X-1)(X-2)(X-3)]\text{.}\) The complete determination is performed using Sage below.

The following uses Sage to determine the general formulas for the Binomial distribution.
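The Sage cell carries out the symbolic derivation; as a complementary sketch, the standard closed forms, skewness \(= \frac{1-2p}{\sqrt{np(1-p)}}\) and excess kurtosis \(= \frac{1-6p(1-p)}{np(1-p)}\text{,}\) can be checked numerically in R against moments computed directly from \(f(x)\text{.}\) The values of \(n\) and \(p\) below are illustrative.

# Moments computed directly from the probability function,
# compared against the standard closed forms
n <- 15
p <- 0.25
x <- 0:n
f <- dbinom(x, n, p)
mu <- sum(x * f)
s2 <- sum((x - mu)^2 * f)

sum((x - mu)^3 * f) / s2^(3/2)          # skewness from the definition
(1 - 2*p) / sqrt(n * p * (1 - p))       # closed form

sum((x - mu)^4 * f) / s2^2 - 3          # excess kurtosis from the definition
(1 - 6*p*(1 - p)) / (n * p * (1 - p))   # closed form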

Flipping Coins

Suppose you flip a fair coin exactly 20 times. Determine the probability of getting exactly 10 heads, and then determine the probability of getting 10 or fewer heads.

Solution

This is binomial with \(n = 20\) and \(p = 1/2\text{,}\) and you are looking for \(f(10)\text{.}\) With these values

\begin{equation*} f(10) = \binom{20}{10} \cdot \left ( \frac{1}{2} \right )^{10} \cdot \left ( \frac{1}{2} \right )^{20-10} = \frac{46189}{262144} \approx 0.176 \end{equation*}
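The same value can be obtained in R, either directly from the formula or with dbinom():

choose(20, 10) * (1/2)^10 * (1/2)^10   # exact formula for f(10)
dbinom(10, 20, 1/2)                    # approximately 0.176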

Notice that the mean of this distribution is \(np = 20 \cdot \frac{1}{2} = 10\text{,}\) so one would expect 10 heads on average. Next, determining the probability of 10 or fewer heads requires \(F(10) = f(0) + f(1) + \cdots + f(10)\text{.}\) There is no "nice" closed formula for \(F\text{,}\) but this calculation can be performed using a graphing calculator, such as the TI-84, where F(x) = binomcdf(n, p, x). In this case, F(10) = binomcdf(20, 1/2, 10) \(\approx 0.588\text{.}\)
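The equivalent computation in R uses pbinom():

pbinom(10, 20, 1/2)          # F(10), approximately 0.588
sum(dbinom(0:10, 20, 1/2))   # same value, summing f(0) + ... + f(10)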