Section 9.2 The Normal Distribution
Definition 9.2.1 The Normal Distribution
Given two parameters \(\mu\) and \(\sigma > 0\text{,}\) a random variable \(X\) over \(R = (-\infty,\infty)\) has a normal distribution provided it has a probability function given by
\begin{equation*}
f(x) = \frac{1}{\sigma \sqrt{2 \pi}} e^{ -\left ( \frac{x-\mu}{\sigma} \right ) ^2 / 2}
\end{equation*}
The normal distribution is also sometimes referred to as the Gaussian distribution (often by physicists) or the bell curve (often by social scientists).
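As a quick numerical companion to the definition, here is a minimal Python sketch of the density (the function name and the sample values of \(\mu\) and \(\sigma\) are illustrative, not from the text); it confirms the peak height \(1/(\sigma\sqrt{2\pi})\) at \(x = \mu\text{.}\)

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Normal density f(x) = exp(-((x-mu)/sigma)^2 / 2) / (sigma*sqrt(2*pi))."""
    return math.exp(-((x - mu) / sigma) ** 2 / 2) / (sigma * math.sqrt(2 * math.pi))

# The peak at x = mu has height 1/(sigma*sqrt(2*pi)).
print(normal_pdf(0.0))                      # standard normal peak, about 0.3989
print(normal_pdf(5.0, mu=5.0, sigma=2.0))   # about 0.1995
```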
Theorem 9.2.2
If \(\mu = 0\) and \(\sigma = 1\text{,}\) then we say \(X\) has a standard normal distribution. In this case we often use \(Z\) as the variable name and write \(\Phi(z)\) for the standard normal distribution function. The density function reduces to
\begin{equation*}
f(z) = \frac{1}{\sqrt{2 \pi}} e^{ -z^2 / 2}
\end{equation*}
Proof
Convert to "standard units" using the conversion
\begin{equation*}
z = \frac{x-\mu}{\sigma} = \frac{x-0}{1} = x.
\end{equation*}
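Although \(\Phi(z)\) has no elementary closed form, it can be evaluated with the error function from Python's standard library via the identity \(\Phi(z) = \bigl(1 + \operatorname{erf}(z/\sqrt{2})\bigr)/2\) (a standard identity, though not derived in this text):

```python
import math

def Phi(z):
    """Standard normal CDF via Phi(z) = (1 + erf(z/sqrt(2))) / 2."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

print(Phi(0.0))   # 0.5, by symmetry of the standard normal density
print(Phi(1.0))   # about 0.8413
```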
Theorem 9.2.3 Verifying the normal probability function
\begin{equation*}
\int_{-\infty}^{\infty} \frac{1}{\sigma \sqrt{2 \pi}} e^{ -\left ( \frac{x-\mu}{\sigma} \right ) ^2 / 2} dx = 1
\end{equation*}
Proof
Note that you can convert the integral above to standard units so that it is sufficient to show
\begin{equation*}
I = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2 \pi}} e^{ -\frac{z^2}{2} } dz = 1
\end{equation*}
Toward this end, consider \(I^2\) and change the variables to get
\begin{align*}
I^2 & = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2 \pi}} e^{ -\frac{u^2}{2} } du \cdot \int_{-\infty}^{\infty} \frac{1}{\sqrt{2 \pi}} e^{ -\frac{v^2}{2} } dv\\
& = \frac{1}{2 \pi} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} e^{ -\frac{u^2+v^2}{2} } du dv
\end{align*}
Converting to polar coordinates using
\begin{equation*}
du dv = r dr d\theta
\end{equation*}
and
\begin{equation*}
u^2 + v^2 = r^2
\end{equation*}
gives
\begin{align*}
I^2 & = \frac{1}{2 \pi} \int_0^{2 \pi} \int_0^{\infty} e^{ -\frac{r^2}{2} } r dr d\theta\\
& = \frac{1}{2 \pi} \int_0^{2 \pi} -e^{ -\frac{r^2}{2} } \big |_0^{\infty} d\theta\\
& = \frac{1}{2 \pi} \int_0^{2 \pi} 1 \cdot d\theta\\
& = \frac{1}{2 \pi} \theta \big |_0^{2 \pi} = 1
\end{align*}
as desired.
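The integral in this theorem can also be checked numerically. The sketch below uses a midpoint Riemann sum over \([-10,10]\) (an interval chosen so the neglected tails are negligibly small) and approximates 1, as expected.

```python
import math

def standard_normal_pdf(z):
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

# Midpoint Riemann sum on [-10, 10]; the tails beyond contribute almost nothing.
n, a, b = 100_000, -10.0, 10.0
h = (b - a) / n
total = sum(standard_normal_pdf(a + (k + 0.5) * h) for k in range(n)) * h
print(total)  # about 1.0
```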
Theorem 9.2.4 Verifying the normal probability mean
\begin{equation*}
E[X] = \int_{-\infty}^{\infty} x \cdot \frac{1}{\sigma \sqrt{2 \pi}} e^{ - \left ( \frac{x-\mu}{\sigma} \right ) ^2 / 2} dx = \mu
\end{equation*}
Proof
\begin{equation*}
z = \frac{x-\mu}{\sigma}
\end{equation*}
implies, solving for \(x\text{,}\) that
\begin{equation*}
x = \mu + z \sigma
\end{equation*}
with \(dx = \sigma \, dz\text{,}\) and therefore
\begin{align*}
E[X] &= \int_{-\infty}^{\infty} x \cdot \frac{1}{\sigma \sqrt{2 \pi}} e^{ - \left ( \frac{x-\mu}{\sigma} \right ) ^2 / 2} dx \\
&= \frac{1}{\sqrt{2 \pi}} \int_{-\infty}^{\infty} (\mu + z\sigma) \cdot e^{ -z^2 / 2} dz\\
&= \mu \cdot \frac{1}{\sqrt{2 \pi}} \int_{-\infty}^{\infty} e^{ -z^2 / 2} dz + \sigma \cdot \frac{1}{\sqrt{2 \pi}} \int_{-\infty}^{\infty} z \cdot e^{ -z^2 / 2} dz\\
&= \mu \cdot 1 + \sigma \cdot 0\\
& = \mu
\end{align*}
and therefore the use of \(\mu\) is warranted.
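A numerical check of the mean, using the illustrative values \(\mu = 3\) and \(\sigma = 2\) (chosen arbitrarily):

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-((x - mu) / sigma) ** 2 / 2) / (sigma * math.sqrt(2 * math.pi))

# E[X] as a midpoint Riemann sum of x*f(x), truncated 10 standard deviations out.
mu, sigma = 3.0, 2.0
n = 100_000
a, b = mu - 10 * sigma, mu + 10 * sigma
h = (b - a) / n
mean = sum((a + (k + 0.5) * h) * normal_pdf(a + (k + 0.5) * h, mu, sigma)
           for k in range(n)) * h
print(mean)  # about 3.0 = mu
```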
Theorem 9.2.5 Verifying the normal probability variance
\begin{equation*}
E[(X-\mu)^2] = \int_{-\infty}^{\infty} (x-\mu)^2 \cdot \frac{1}{\sigma \sqrt{2 \pi}} e^{ - \left ( \frac{x-\mu}{\sigma} \right ) ^2 / 2} dx = \sigma^2
\end{equation*}
Proof
\begin{align*}
E[(X-\mu)^2] & = \int_{-\infty}^{\infty} (x-\mu)^2 \cdot \frac{1}{\sigma \sqrt{2 \pi}} e^{ - \left ( \frac{x-\mu}{\sigma} \right ) ^2 / 2} dx\\
& = \frac{1}{\sqrt{2 \pi}} \int_{-\infty}^{\infty} \sigma^2 z^2 \cdot e^{ -z^2 / 2} dz\\
& = \frac{\sigma^2}{\sqrt{2 \pi}} \int_{-\infty}^{\infty} z \cdot z e^{ -z^2 / 2} dz\\
& = \frac{\sigma^2}{\sqrt{2 \pi}} \cdot \big [ -z e^{-z^2 / 2} \big |_{-\infty}^{\infty} + \int_{-\infty}^{\infty} e^{ -z^2 / 2} dz \big ]\\
& = \frac{\sigma^2}{\sqrt{2 \pi}} \cdot \big [ 0 + \sqrt{2 \pi} \big ]\\
& = \sigma^2
\end{align*}
using integration by parts and using the integration in the proof of the mean above. So, the use of \(\sigma\) is warranted.
Theorem 9.2.6 Properties of the Normal Distribution
Theorem 9.2.7 Normal Distribution Maximum
The maximum of the normal distribution probability function occurs when \(x = \mu\text{.}\)
Proof
Take the derivative of the probability function to get
\begin{equation*}
\frac{\sqrt{2} {\left(\mu - x\right)} e^{\left(-\frac{{\left(\mu - x\right)}^{2}}{2 \, \sigma^{2}}\right)}}{2 \, \sqrt{\pi} \sigma^{3}}
\end{equation*}
which is zero only when \(x = \mu\text{.}\) Checking the sign of the derivative to the left and right of this value shows that this critical value yields a maximum.
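That sign check is easy to carry out numerically. This sketch uses the simplified form \(f'(x) = (\mu - x)\,f(x)/\sigma^2\) of the derivative above, with illustrative parameter values:

```python
import math

def normal_pdf_prime(x, mu, sigma):
    """Derivative f'(x) = (mu - x) * f(x) / sigma^2."""
    f = math.exp(-((x - mu) / sigma) ** 2 / 2) / (sigma * math.sqrt(2 * math.pi))
    return (mu - x) * f / sigma ** 2

mu, sigma = 1.0, 0.5
print(normal_pdf_prime(mu - 0.1, mu, sigma) > 0)  # True: increasing left of mu
print(normal_pdf_prime(mu + 0.1, mu, sigma) < 0)  # True: decreasing right of mu
```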
Theorem 9.2.8 Normal Distribution Points of Inflection
Points of inflection for the normal distribution probability function occur at \(x = \mu + \sigma\) and \(x = \mu - \sigma\text{.}\)
Proof
Take the second derivative of the probability function to get
\begin{equation*}
\frac{\sqrt{2} {\left(\mu + \sigma - x\right)} {\left(\mu - \sigma - x\right)} e^{\left(-\frac{\mu^{2}}{2 \, \sigma^{2}} + \frac{\mu x}{\sigma^{2}} - \frac{x^{2}}{2 \, \sigma^{2}}\right)}}{2 \, \sqrt{\pi} \sigma^{5}}
\end{equation*}
which is zero only when \(x = \mu \pm \sigma\text{.}\) Checking the sign of the second derivative on either side of each of these values shows that the concavity changes there, so both critical values yield points of inflection.
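Likewise, the concavity change at \(\mu \pm \sigma\) can be verified numerically using the simplified form \(f''(x) = \bigl((x-\mu)^2 - \sigma^2\bigr)f(x)/\sigma^4\) of the second derivative above, with illustrative values:

```python
import math

def normal_pdf_second(x, mu, sigma):
    """Second derivative f''(x) = ((x - mu)^2 - sigma^2) * f(x) / sigma^4."""
    f = math.exp(-((x - mu) / sigma) ** 2 / 2) / (sigma * math.sqrt(2 * math.pi))
    return ((x - mu) ** 2 - sigma ** 2) * f / sigma ** 4

mu, sigma = 0.0, 1.0
# Concave down between the inflection points, concave up outside them.
print(normal_pdf_second(mu, mu, sigma) < 0)              # True
print(normal_pdf_second(mu + 2 * sigma, mu, sigma) > 0)  # True
```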
Notice that the work needed to complete the integrals over the entire domain above was pretty serious. Over a general interval, however, the density has no elementary antiderivative, so probabilities must be approximated numerically. When using TI graphing calculators, you can use
\begin{equation*}
P( a \lt x \lt b ) = \text{normalcdf}(a,b,\mu, \sigma).
\end{equation*}
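The same computation is easy to reproduce without a calculator. In this sketch the function name normalcdf simply mirrors the TI command, and \(\Phi\) is computed from the standard error-function identity:

```python
import math

def normalcdf(a, b, mu=0.0, sigma=1.0):
    """P(a < X < b) for X normal(mu, sigma), mirroring TI's normalcdf(a, b, mu, sigma)."""
    Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return Phi((b - mu) / sigma) - Phi((a - mu) / sigma)

print(normalcdf(-1, 1))  # about 0.6827: probability within one standard deviation
```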