Section 6.3 Continuous Uniform Distribution

Modeling the idea of "equally likely" in a continuous world requires a slightly different perspective, since there are infinitely many outcomes to consider. Instead, require that intervals of equal width in the domain have the same probability regardless of where they lie in that domain. This behaviour suggests

\begin{equation*} P(u \lt X \lt v) = P(u + \Delta \lt X \lt v + \Delta) \end{equation*}

for reasonable values of \(\Delta\) so that the interval remains inside \(R\text{.}\)

From before, for \(X\) a continuous uniform variable, we get

\begin{gather*} \int_u^v f(x) dx = \int_{u+\Delta}^{v+\Delta} f(x) dx\\ F(v)-F(u) = F(v+\Delta)-F(u+\Delta)\\ F(u+\Delta)-F(u) = F(v+\Delta)-F(v)\\ \frac{F(u+\Delta)-F(u)}{\Delta} = \frac{F(v+\Delta)-F(v)}{\Delta} \end{gather*}

which is true regardless of \(\Delta\) so long as you stay in the domain of interest. Letting \(\Delta \rightarrow 0\) gives

\begin{equation*} F'(u) = F'(v) \end{equation*}

but since \(F\) is an antiderivative of the density function \(f\text{,}\)

\begin{equation*} f(u) = f(v) \end{equation*}

for all \(u\) and \(v\) in \(R\text{.}\) This only happens if \(f\) is constant; say \(f(x)=c\text{.}\) If the space of \(X\) is a single interval with \(R = [a,b]\text{,}\) then

\begin{equation*} 1 = \int_a^b c dx = c(b-a) \end{equation*}

which yields \(c = \frac{1}{b-a}\) as desired.

Example 6.3.2. Basic Continuous Uniform.

On \(R = [1,2 \pi]\text{,}\)

\begin{equation*} f(x) = \frac{1}{2 \pi - 1}. \end{equation*}

Then, if you want to compute something like \(P(2 < X < 4.5)\text{,}\) integrate

\begin{equation*} P(2 < X < 4.5) = \int_2^{4.5} \frac{1}{2 \pi -1} dx = \frac{2.5}{2 \pi - 1} \end{equation*}
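This integral is easy to check numerically. Here is a minimal Python sketch (a stand-in for the Sage cells used elsewhere in this text); the helper `f` is an illustrative name, and the interval values are taken from the example above.

```python
import math

# Uniform density on [1, 2*pi], as in Example 6.3.2.
a, b = 1.0, 2.0 * math.pi

def f(x):
    """Constant density 1/(b - a) on [a, b], zero elsewhere."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

# P(2 < X < 4.5): approximate the integral with a midpoint Riemann sum.
n = 100_000
h = (4.5 - 2.0) / n
approx = sum(f(2.0 + (i + 0.5) * h) * h for i in range(n))

# Closed form from the text: width of the interval times the constant density.
exact = 2.5 / (2.0 * math.pi - 1.0)
```

Since the density is constant, the "integration" is really just width times height; the Riemann sum is only there to mirror the integral written above.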

Checkpoint 6.3.3. WebWork - Continuous Uniform.

Example 6.3.4. Continuous Uniform over two disjoint intervals.

Suppose \(R = [0,2] \cup [5,7]\text{.}\) Then, as in the theorem proof

\begin{equation*} 1 = \int_R c \cdot dx = \int_0^2 c \cdot dx + \int_5^7 c \cdot dx = 4c. \end{equation*}

Thus, \(f(x) = \frac{1}{4}\text{.}\) For computing probabilities, you will want to break up any resulting integrals in a similar manner.
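The piecewise bookkeeping can be sketched as follows; the interval list and the helper `prob` are illustrative names invented for this sketch, not part of any library.

```python
# Uniform density over R = [0,2] union [5,7]: total length 4, so f(x) = 1/4.
intervals = [(0.0, 2.0), (5.0, 7.0)]
total_length = sum(hi - lo for lo, hi in intervals)
c = 1.0 / total_length  # the constant density, 1/4 here

def prob(u, v):
    """P(u < X < v): sum the overlap of (u, v) with each piece of R."""
    overlap = sum(max(0.0, min(v, hi) - max(u, lo)) for lo, hi in intervals)
    return c * overlap

# e.g. P(1 < X < 6) picks up [1,2] and [5,6]: overlap length 2, probability 1/2
```

Each piece of \(R\) contributes its own integral, exactly as the text suggests when it says to break up the resulting integrals.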

We can verify most of these summary measures (the mean, variance, and skewness) directly here, and you can also determine them using Sage below.

For the mean,

\begin{align*} \mu & = E[X] = \int_a^b x \frac{1}{b-a} dx\\ & = \frac{x^2}{2(b-a)} \bigg|_a^b\\ & = \frac{b^2-a^2}{2(b-a)} = \frac{b+a}{2}. \end{align*}

For the variance,

\begin{align*} \sigma^2 & = E[X^2] - \mu^2 = \int_a^b x^2 \frac{1}{b-a} dx - \mu^2\\ & = \frac{x^3}{3(b-a)} \bigg|_a^b - \left ( \frac{a+b}{2} \right )^2\\ & = \frac{b^3-a^3}{3(b-a)} - \frac{a^2 + 2ab + b^2}{4}\\ & = \frac{4 b^2 + 4 ab + 4 a^2 - 3a^2 - 6 ab - 3b^2}{12}\\ & = \frac{b^2-2ab+a^2}{12} = \frac{(b-a)^2}{12}. \end{align*}

For the skewness,

\begin{align*} \gamma_0 & = E[X^3] - 3 \mu E[X^2] + 2 \mu^3\\ & = \int_a^b x^3 \frac{1}{b-a} dx - 3 \mu \frac{b^3-a^3}{3(b-a)} + 2 \left ( \frac{a+b}{2} \right )^3\\ & = \frac{x^4}{4(b-a)} \bigg|_a^b - 3 \frac{a+b}{2} \cdot \frac{b^3-a^3}{3(b-a)} + 2 \frac{a^3 + 3a^2 b + 3a b^2 + b^3}{8} \\ & = \text{a miracle of algebra}\\ & = 0. \end{align*}

Computing the kurtosis involves more algebra like the above. We will just let Sage do that part for us below.
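Short of the Sage cell, the formulas above can be spot-checked numerically. This Python sketch uses a midpoint Riemann sum on an arbitrarily chosen interval \([2,10]\) to confirm \(\mu = (a+b)/2\text{,}\) \(\sigma^2 = (b-a)^2/12\text{,}\) and that the third central moment vanishes.

```python
# Midpoint-rule check of the uniform moment formulas on a sample
# interval [a, b] = [2, 10] (chosen arbitrarily for illustration).
a, b = 2.0, 10.0
n = 100_000
h = (b - a) / n
mids = [a + (i + 0.5) * h for i in range(n)]
f = 1.0 / (b - a)  # constant uniform density

# mu = (a + b)/2
mean = sum(x * f * h for x in mids)
# sigma^2 = (b - a)^2 / 12
var = sum((x - mean) ** 2 * f * h for x in mids)
# third central moment (gamma_0 above) should vanish by symmetry
third = sum((x - mean) ** 3 * f * h for x in mids)
```

With \(a=2\) and \(b=10\) the targets are \(\mu = 6\) and \(\sigma^2 = 64/12\text{;}\) the odd central moments cancel because the density is symmetric about the midpoint.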

Example 6.3.6. Occurrence of exactly one event randomly in a given interval.

Suppose you know that only one person showed up at the counter of a local business in a given 30-minute interval of time. Then \(R = [0,30]\text{,}\) giving \(f(x) = 1/30\text{.}\)

Further, the probability that the person arrived within the first 6 minutes would be \(\int_0^6 \frac{1}{30} dx = 0.2\text{.}\)

In general, for \(x\) in \([a,b]\text{,}\)

\begin{equation*} F(x) = \int_a^x \frac{1}{b-a} du = \frac{u}{b-a} \big |_a^x = \frac{x-a}{b-a}. \end{equation*}
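The piecewise nature of \(F\) is easy to express in code. Below is a minimal Python sketch; `uniform_cdf` is an illustrative name, and the final line reproduces the arrival probability from Example 6.3.6.

```python
def uniform_cdf(x, a, b):
    """CDF of the continuous uniform distribution on [a, b]:
    0 below a, (x - a)/(b - a) on [a, b], and 1 above b."""
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)

# Example 6.3.6: one arrival uniform on [0, 30] minutes.
# Probability the arrival falls in the first 6 minutes:
p_first_six = uniform_cdf(6, 0, 30)  # (6 - 0)/(30 - 0) = 0.2
```

Note that \(P(X \le 6) = F(6)\) agrees with the integral \(\int_0^6 \frac{1}{30}\, dx = 0.2\) computed in the example.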