Section 8.2 Poisson Distribution
Consider a Poisson process in which you observe an interval of fixed length T and let X measure the (variable) number of successes, or changes, within that interval. The resulting distribution of X is called a Poisson distribution.
Theorem 8.2.1 Poisson Probability Function
Assume X measures the number of successes in an interval [0,T] within some Poisson process. Then,
\begin{equation*}
f(x) = \frac{\mu^x e^{-\mu}}{x!}
\end{equation*}
for \(x \in R = \{ 0, 1, 2, \ldots \}\text{,}\) where \(\mu = \lambda T\) and \(\lambda\) is the rate of the process.
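For instance, if changes occur at a rate of \(\lambda = 2\) per hour and the interval has length \(T = 1.5\) hours, then \(\mu = 3\) and
\begin{equation*}
f(2) = \frac{3^2 e^{-3}}{2!} \approx 0.2240.
\end{equation*}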
Proof
For a sufficiently large natural number n, break up the given interval [0,T] into n uniform subintervals, each of width h = T/n. By the properties of Poisson processes, when n is very large, h will be very small, and eventually small enough so that
\begin{equation*}
P(\text{exactly one success in a given subinterval}) = p = \frac{\lambda T}{n}.
\end{equation*}
Since there are then a finite number n of independent subintervals, each with probability p of containing a success, you can use a Binomial distribution to evaluate the corresponding probabilities for any finite n. Applying the Binomial probability function and then taking the limit as n approaches infinity gives:
\begin{align*}
f(x) & = P(x \text{ changes in } [0,T]) \\
& = \lim_{n \rightarrow \infty} \binom{n}{x} p^x (1-p)^{n-x}\\
& = \lim_{n \rightarrow \infty} \binom{n}{x} \left(\frac{\lambda T}{n}\right)^x \left(1-\frac{\lambda T}{n}\right)^{n-x}\\
& = \lim_{n \rightarrow \infty} \frac{n(n-1)\cdots(n-x+1)}{x!} \left(\frac{\lambda T}{n}\right)^x \left(1-\frac{\lambda T}{n}\right)^{n-x}\\
& = \frac{(\lambda T)^x}{x!} \lim_{n \rightarrow \infty} \frac{n(n-1)\cdots(n-x+1)}{n \cdot n \cdots n} \left(1-\frac{\lambda T}{n}\right)^{n}\left(1-\frac{\lambda T}{n}\right)^{-x}\\
& = \frac{(\lambda T)^x}{x!} \lim_{n \rightarrow \infty} \left(1-\frac{1}{n}\right)\cdots\left(1-\frac{x-1}{n}\right) \left(1-\frac{\lambda T}{n}\right)^{n}\left(1-\frac{\lambda T}{n}\right)^{-x}\\
& = \frac{(\lambda T)^x}{x!}
\lim_{n \rightarrow \infty} \left(1-\frac{\lambda T}{n}\right)^{n}
\lim_{n \rightarrow \infty} \left(1-\frac{\lambda T}{n}\right)^{-x}\\
& = \frac{(\lambda T)^x}{x!}
\lim_{n \rightarrow \infty} \left(1-\frac{\lambda T}{n}\right)^{n} \cdot 1\\
& = \frac{(\lambda T)^x}{x!} e^{-\lambda T}\\
& = \frac{\mu^x e^{-\mu}}{x!}
\end{align*}
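To see this limit numerically, the short sketch below (plain Python, which also runs in a Sage cell; the values \(\lambda T = 3\) and \(x = 2\) are chosen only for illustration) compares the Binomial probability with \(p = \lambda T / n\) against the limiting Poisson value as n grows.
\begin{verbatim}
from math import comb, exp, factorial

lam_T = 3.0   # illustrative value of lambda*T
x = 2         # number of changes whose probability is tracked

def binom_pmf(n, x, p):
    # Binomial probability of exactly x successes in n trials
    return comb(n, x) * p**x * (1 - p)**(n - x)

poisson_limit = lam_T**x * exp(-lam_T) / factorial(x)

for n in (10, 100, 1000, 10000):
    p = lam_T / n                      # p = lambda*T/n, as in the proof
    print(n, binom_pmf(n, x, p), poisson_limit)
\end{verbatim}
The Binomial values approach the fixed Poisson value as n increases, mirroring the limit taken above.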
Theorem 8.2.2 Verify Poisson Probability Function
\begin{equation*}
\sum_{x=0}^{\infty} \frac{(\lambda T)^x}{x!} e^{-\lambda T} = 1
\end{equation*}
Proof
Using the power series expansion for the natural exponential function,
\begin{align*}
\sum_{x=0}^{\infty} f(x) & = \sum_{x=0}^{\infty} \frac{(\lambda T)^x}{x!} e^{-\lambda T} \\
& = e^{-\lambda T} \sum_{x=0}^{\infty} \frac{(\lambda T)^x}{x!} \\
& = e^{-\lambda T} e^{\lambda T} \\
& = 1
\end{align*}
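A quick numerical sanity check of this theorem (plain Python; the value \(\mu = \lambda T = 3\) is an illustrative assumption) shows the partial sums of f(x) approaching 1.
\begin{verbatim}
from math import exp, factorial

mu = 3.0   # illustrative value of mu = lambda*T

def f(x):
    # Poisson probability f(x) = mu^x e^(-mu) / x!
    return mu**x * exp(-mu) / factorial(x)

for upper in (5, 10, 20, 40):
    print(upper, sum(f(x) for x in range(upper + 1)))   # partial sums approach 1
\end{verbatim}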
Theorem 8.2.3 Statistics for Poisson
\begin{equation*}
\mu = \lambda T
\end{equation*}
\begin{equation*}
\sigma^2 = \mu
\end{equation*}
\begin{equation*}
\gamma_1 = \frac{1}{\sqrt{\mu}}
\end{equation*}
\begin{equation*}
\gamma_2 = \frac{1}{\mu}+3
\end{equation*}
Proof
Using the probability function f(x) derived in Theorem 8.2.1, the mean is given by
\begin{align*}
\mu & = E[X] \\
& = \sum_{x=0}^{\infty} x \cdot \frac{(\lambda T)^x}{x!} e^{-\lambda T}\\
& = \lambda T e^{-\lambda T} \sum_{x=1}^{\infty} \frac{(\lambda T)^{x-1}}{(x-1)!} \\
& = \lambda T e^{-\lambda T} \sum_{k=0}^{\infty} \frac{(\lambda T)^k}{k!} \\
& = \lambda T e^{-\lambda T} e^{\lambda T} \\
& = \lambda T
\end{align*}
which confirms the use of \(\mu\) in the original probability formula.
Continuing with \(\mu = \lambda T\text{,}\) the variance is given by
\begin{align*}
\sigma^2 & = E[X^2] - \mu^2 \\
& = E[X(X-1)] + E[X] - \mu^2 \\
& = \sum_{x=0}^{\infty} x(x-1) \cdot \frac{\mu^x}{x!} e^{-\mu} + \mu - \mu^2\\
& = e^{-\mu} \mu^2 \sum_{x=2}^{\infty} \frac{\mu^{x-2}}{(x-2)!} + \mu - \mu^2\\
& = e^{-\mu} \mu^2 \sum_{k=0}^{\infty} \frac{\mu^k}{k!} + \mu - \mu^2\\
& = \mu^2 + \mu - \mu^2 \\
& = \mu
\end{align*}
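The same truncation idea gives a numerical check of both results (plain Python; \(\mu = 3\) is again an illustrative assumption, and the infinite sums are truncated at x = 100, beyond which the terms are negligible).
\begin{verbatim}
from math import exp, factorial

mu = 3.0   # illustrative value of mu = lambda*T

def f(x):
    return mu**x * exp(-mu) / factorial(x)

mean = sum(x * f(x) for x in range(101))
second_factorial_moment = sum(x * (x - 1) * f(x) for x in range(101))
variance = second_factorial_moment + mean - mean**2   # E[X(X-1)] + mu - mu^2

print(mean, variance)   # both approximately equal mu = 3.0
\end{verbatim}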
To derive the skewness and kurtosis, you can rely on Sage; see the live cell below.
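A rough equivalent of that cell (a sketch assuming a recent version of SymPy's sympy.stats module rather than Sage itself) derives the same statistics symbolically; it should report skewness \(1/\sqrt{\mu}\) and kurtosis \(1/\mu + 3\text{,}\) matching the theorem.
\begin{verbatim}
from sympy import symbols, simplify
from sympy.stats import Poisson, E, variance, skewness, kurtosis

mu = symbols('mu', positive=True)
X = Poisson('X', mu)                # Poisson random variable with mean mu = lambda*T

print(simplify(E(X)))               # expected to simplify to mu
print(simplify(variance(X)))        # expected to simplify to mu
print(simplify(skewness(X)))        # expected to simplify to 1/sqrt(mu)
print(simplify(kurtosis(X)))        # expected to simplify to 1/mu + 3
\end{verbatim}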
Because the Poisson probability function arises as a limit of Binomial probabilities, you can also use a Poisson distribution to approximate a Binomial distribution when n is sufficiently large and p is small, taking \(\mu = np\text{.}\)
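For example (plain Python; the values n = 500 and p = 0.006 are illustrative assumptions chosen so that \(\mu = np = 3\)), the two sets of probabilities agree closely:
\begin{verbatim}
from math import comb, exp, factorial

n, p = 500, 0.006          # large n, small p (illustrative)
mu = n * p                 # Poisson mean used for the approximation

for x in range(6):
    binom = comb(n, x) * p**x * (1 - p)**(n - x)
    poiss = mu**x * exp(-mu) / factorial(x)
    print(x, round(binom, 4), round(poiss, 4))
\end{verbatim}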