
Section 5.4 Expected Value

Blaise Pascal was a 17th-century mathematician and philosopher who was accomplished in many areas but is perhaps best known to you for what is now called Pascal's Triangle. As part of his philosophical pursuits, he proposed what is known as "Pascal's wager". It considers two mutually exclusive possibilities: that God exists or that he does not. His argument is that a rational person should live as though God exists and seek to believe in God. If God does not actually exist, such a person suffers only a finite loss (some pleasures, luxury, etc.), whereas they stand to receive an infinite gain, represented by eternity in Heaven, and to avoid an infinite loss, represented by eternity in Hell. This type of reasoning is part of what is known as "decision theory".

You may not confront such dire payouts when making your daily decisions, but we still need a formal method for making these determinations precise. The procedure for doing so is what we call expected value.

Definition 5.4.1 Expected Value

Given a random variable X over space R, corresponding probability function f(x) and "value function" u(x), the expected value of u(x) is given by

\begin{equation*} E = E[u(X)] = \sum_{x \in R} u(x) f(x) \end{equation*}

provided X is discrete, or

\begin{equation*} E = E[u(X)] = \int_R u(x)f(x) dx \end{equation*}

provided X is continuous.

The linearity properties of expected value follow from the corresponding linearity properties of the summation and integration operations. For example, to verify additivity in the continuous case:

\begin{align*} E[u(X) + v(X)] & = \int_{x \in R} [u(x)+v(x)]f(x) dx\\ & = \int_{x \in R} u(x)f(x) dx + \int_{x \in R} v(x)f(x) dx\\ & = E[u(X)] + E[v(X)]. \end{align*}

Consider \(f(x) = x/10\) over R = {1,2,3,4}, where the payout is 10 euros if x = 1, 5 euros if x = 2, 2 euros if x = 3, and -7 euros if x = 4. Then your value function is u(1) = 10, u(2) = 5, u(3) = 2, and u(4) = -7. Computing the expected payout gives

\begin{equation*} E = 10 \times 1/10 + 5 \times 2/10 + 2 \times 3/10 - 7 \times 4/10 = -2/10 \end{equation*}

Therefore, the expected payout is actually negative because the relatively large negative payout is attached to the most likely outcome, while the largest positive payout is attached only to the least likely one.
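As a quick check of this sum, here is a minimal Python sketch (the names expected_value, space, f, and u are illustrative choices, not part of the text) that reproduces the -2/10 result exactly:

from fractions import Fraction

# Expected value of a discrete random variable: E[u(X)] = sum of u(x)*f(x) over R.
def expected_value(u, f, space):
    """Sum u(x) * f(x) over every point x in the space R."""
    return sum(u[x] * f[x] for x in space)

space = [1, 2, 3, 4]
f = {x: Fraction(x, 10) for x in space}   # f(x) = x/10
u = {1: 10, 2: 5, 3: 2, 4: -7}            # payout in euros for each outcome

print(expected_value(u, f, space))        # prints -1/5, i.e. -2/10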

Consider \(f(x) = x^2/3\) over R = [-1,2] with value function given by \(u(x) = e^x - 1\text{.}\) Then, the expected value for u(x) is given by

\begin{equation*} E = \int_{-1}^2 (e^x-1) \cdot x^2/3 \, dx = -1/9 \cdot (e + 15) \cdot e^{-1} + 2/3 \cdot e^2 - 8/9 \approx 3.3129 \end{equation*}
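A numerical integration gives the same value; the following sketch assumes SciPy is available (it is not part of the text, just a convenient integrator):

# Quick numerical check of the continuous example.
from math import e, exp
from scipy.integrate import quad

def f(x):                    # probability density on R = [-1, 2]
    return x**2 / 3

def u(x):                    # value function
    return exp(x) - 1

value, _ = quad(lambda x: u(x) * f(x), -1, 2)
print(value)                                     # approximately 3.3129
print(-1/9 * (e + 15) / e + 2/3 * e**2 - 8/9)    # closed form from above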
Definition 5.4.5 Theoretical Measures

Given a random variable X with probability function f(x) over space R

  1. The mean of X = \(\mu = E[X]\)
  2. The variance of X = \(\sigma^2 = E[(X-\mu)^2]\)
  3. The skewness of X = \(\gamma_1 = \frac{E[(X-\mu)^3]}{\sigma^3}\)
  4. The kurtosis of X = \(\gamma_2 = \frac{E[(X-\mu)^4]}{\sigma^4}\)

In each case, a computational shortcut follows by expanding the binomial inside the expected value and applying linearity.
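For example, for the variance,

\begin{align*} \sigma^2 & = E[(X-\mu)^2] = E[X^2 - 2\mu X + \mu^2]\\ & = E[X^2] - 2\mu E[X] + \mu^2\\ & = E[X^2] - 2\mu^2 + \mu^2\\ & = E[X^2] - \mu^2. \end{align*}

The analogous expansions, \(E[(X-\mu)^3] = E[X^3] - 3\mu E[X^2] + 2\mu^3\) and \(E[(X-\mu)^4] = E[X^4] - 4\mu E[X^3] + 6\mu^2 E[X^2] - 3\mu^4\text{,}\) are the forms used for the skewness and kurtosis numerators in the examples that follow.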

Consider the following example of computing these statistics for a discrete variable. In this case, we use a variable with a relatively small space so that the summations can easily be done by hand. Indeed, consider

X    f(x)
0    0.10
1    0.25
2    0.40
4    0.15
7    0.10
Table 5.4.7 Discrete Probability Function Example

Using the definition of mean as a sum,

\begin{align*} \mu & = 0 \cdot 0.10 + 1 \cdot 0.25 + 2 \cdot 0.40 + 4 \cdot 0.15 + 7 \cdot 0.10\\ & = 0 + 0.25 + 0.80 + 0.60 + 0.70\\ & = 2.35 \end{align*}

Notice where this lies on the probability histogram for this distribution.

For the variance

\begin{align*} \sigma^2 & = E[X^2] - \mu^2\\ & = \left [ 0^2 \cdot 0.10 + 1^2 \cdot 0.25 + 2^2 \cdot 0.40 + 4^2 \cdot 0.15 + 7^2 \cdot 0.10 \right ] - 2.35^2\\ & = 0 + 0.25 + 1.60 + 2.40 + 4.90 - 5.5225\\ & = 9.15 - 5.5225\\ & = 3.6275 \end{align*}

and so the standard deviation \(\sigma = \sqrt{3.6275} \approx 1.90\text{.}\) Notice that 4 times this value encompasses almost all of the range of the distribution.

For the skewness

\begin{align*} \text{Numerator = } & E[X^3] - 3 \mu E[X^2] + 2\mu^3\\ & = \left [ 0^3 \cdot 0.10 + 1^3 \cdot 0.25 + 2^3 \cdot 0.40 + 4^3 \cdot 0.15 + 7^3 \cdot 0.10 \right ] - 3 \cdot 2.35 \cdot 9.15 + 2 \cdot 2.35^3\\ & \approx 0 + 0.25 + 3.20 + 9.60 + 34.3 - 64.5075 + 25.96\\ & = 47.35 - 64.5075 + 25.96\\ & \approx 8.80 \end{align*}

which yields a skewness of \(\gamma_1 = 8.80 / \sigma^3 \approx 1.27 \text{.}\) This indicates a slight skewness to the right of the mean. Notice that the 4 and 7 entries on the histogram produce a slight trailing off to the right.

Finally, for kurtosis

\begin{align*} \text{Numerator = } & E[X^4] - 4 \mu E[X^3] + 6 \mu^2 E[X^2] - 3\mu^4\\ & = \left [ 0^4 \cdot 0.10 + 1^4 \cdot 0.25 + 2^4 \cdot 0.40 + 4^4 \cdot 0.15 + 7^4 \cdot 0.10 \right ] - 4 \cdot 2.35 \cdot 47.35 + 6 \cdot 2.35^2 \cdot 9.15 - 3 \cdot 2.35^4\\ & \approx 0 + 0.25 + 6.40 + 38.4 + 240.1 - 445.09 + 303.19 - 91.49\\ & \approx 285.15 - 445.09 + 303.19 - 91.49\\ & \approx 51.75 \end{align*}

which yields a kurtosis of \(\gamma_2 = 51.75 / \sigma^4 \approx 3.93\text{,}\) which suggests that the distribution is modestly bell-shaped.
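These hand computations can be cross-checked with a short script; here is a minimal Python sketch (the helper name raw_moment is an illustrative choice, not from the text):

# Cross-check of the mean, variance, skewness, and kurtosis for Table 5.4.7.
xs = [0, 1, 2, 4, 7]
ps = [0.10, 0.25, 0.40, 0.15, 0.10]

def raw_moment(k):
    """E[X^k] computed directly from the table."""
    return sum(x**k * p for x, p in zip(xs, ps))

mu = raw_moment(1)                                               # 2.35
var = raw_moment(2) - mu**2                                      # 3.6275
sigma = var**0.5                                                 # about 1.90
skew = sum((x - mu)**3 * p for x, p in zip(xs, ps)) / sigma**3   # about 1.27
kurt = sum((x - mu)**4 * p for x, p in zip(xs, ps)) / sigma**4   # about 3.93

print(mu, var, skew, kurt)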

Now consider computing these statistics for a continuous variable. Let \(f(x) = \frac{3}{4} \cdot (1-x^2)\) over R = [-1,1].

Then for the mean

\begin{align*} \mu & = \int_{-1}^1 x \cdot \frac{3}{4} \cdot (1-x^2) dx\\ & = \int_{-1}^1 \frac{3}{4} \cdot (x-x^3) dx\\ & = \frac{3}{4} \cdot (x^2/2-x^4/4) \big |_{-1}^1\\ & = \frac{3}{4} \cdot \left( [(1/2)-(1/4)] - [(1/2) - (1/4)] \right)\\ & = 0 \end{align*}

as expected since the probability function is symmetric about x=0.

For the variance

\begin{align*} \sigma^2 & = \int_{-1}^1 x^2 \cdot \frac{3}{4} \cdot (1-x^2) dx - \mu^2\\ & = \int_{-1}^1 \frac{3}{4} \cdot (x^2-x^4) dx - 0\\ & = \frac{3}{4} \cdot (x^3 /3 -x^5 / 5) \big |_{-1}^1\\ & = \frac{3}{4} \cdot 2 \cdot (1/3-1/5)\\ & = \frac{3}{4} \cdot \frac{4}{15}\\ & = \frac{1}{5} \end{align*}

and taking the square root gives a standard deviation slightly less than 1/2. Notice that four times this value encompasses almost all of the range of the distribution.

For the skewness, notice that the graph is symmetric about the mean, so we would expect a skewness of 0. Just to check,

\begin{align*} \text{Numerator = } & E[X^3] - 3 \mu E[X^2] + 2\mu^3\\ & = \int_{-1}^1 x^3 \cdot \frac{3}{4} \cdot (1-x^2) dx - 3 E[X^2] \cdot 0 + 2 \cdot 0^3 \\ & = \int_{-1}^1 \frac{3}{4} \cdot (x^3-x^5) dx\\ & = \frac{3}{4} \cdot (x^4/4-x^6/6) \big |_{-1}^1\\ & = 0 \end{align*}

as expected without having to actually complete the calculation by dividing by the cube of the standard deviation.

Finally, note that the probability function in this case is modestly close to a bell shaped curve so we would expect a kurtosis in the vicinity of 3. Indeed, noting that (conveniently) \(\mu = 0\) gives

\begin{align*} \text{Numerator = } & E[X^4] - 4 \mu E[X^3] + 6 \mu^2 E[X^2] - 3 \mu^4\\ & = \int_{-1}^1 x^4 \cdot \frac{3}{4} \cdot (1-x^2) dx\\ & = \frac{3}{4} \cdot (x^5 /5-x^7 /7) \big |_{-1}^1\\ & = \frac{3}{4} \cdot 2(1/5-1/7)\\ & = \frac{3}{35} \end{align*}

and so dividing by \(\sigma^4 = \left(\sqrt{\tfrac{1}{5}}\right)^4 = \frac{1}{25}\) gives a kurtosis of

\begin{equation*} \gamma_2 = \frac{3}{35} / \frac{1}{25} = \frac{75}{35} \approx 2.14. \end{equation*}
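The same continuous moments can also be computed symbolically; here is a short sketch assuming SymPy is available (again, not part of the text, just a convenient checker):

import sympy as sp

# Exact moments of f(x) = 3/4 * (1 - x^2) on R = [-1, 1].
x = sp.symbols('x')
f = sp.Rational(3, 4) * (1 - x**2)

mu = sp.integrate(x * f, (x, -1, 1))                                        # 0
var = sp.integrate((x - mu)**2 * f, (x, -1, 1))                             # 1/5
skew = sp.integrate((x - mu)**3 * f, (x, -1, 1)) / var**sp.Rational(3, 2)   # 0
kurt = sp.integrate((x - mu)**4 * f, (x, -1, 1)) / var**2                   # 15/7

print(mu, var, skew, kurt)    # 0 1/5 0 15/7, and 15/7 is approximately 2.14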

Roulette is a gambling game popular in many casinos in which a player attempts to win money from the casino by predicting the sector in which a ball lands on a spinning wheel. There are two variations of this game, the American version and the European version; the difference is that the American version has one additional numbered slot on the wheel. The American version of the game will be used for the purposes of this example.

A Roulette wheel consists of 38 equally-sized sectors identified with the numbers 1 through 36 plus 0 and 00. The 0 and 00 sectors are colored green; half of the remaining numbers are in sectors colored red and the remainder are colored black. A steel ball is dropped onto the spinning wheel, and the sector in which the ball comes to rest is noted. It is easy to see that the probability of landing on any one of the 38 sectors is 1/38. A picture of a typical American-style wheel and betting board appears below.

(Image: American-style Roulette wheel and betting board, found at BigFishGames.com.)

Since this is a casino game, there must be the opportunity to bet (and likely lose) money. For the remainder of this example we will assume that you bet 1 dollar each time; if you bet more, the values scale correspondingly. If you place your bet on any single number and the ball ends up on the sector corresponding to that number, you win a net of 35 dollars. If the ball lands elsewhere, you lose your dollar. Therefore the expected value of winning if you bet on one number is

\begin{equation*} E[\text{win on one}] = 35 \cdot \frac{1}{38} - 1 \cdot \frac{37}{38} = - \frac{2}{38} \end{equation*}

which is a little more than a nickel loss on average.

You can also bet on two numbers, and if the ball lands on either of the two you win a payout of 17 dollars. Therefore the expected value of winning if you bet on two numbers is

\begin{equation*} E[\text{win on two numbers}] = 17 \cdot \frac{2}{38} - 1 \cdot \frac{36}{38} = - \frac{2}{38}. \end{equation*}

Continuing, you can bet on three numbers and if the ball lands on any of the three then you win a payout of 11 dollars. Therefore the expected value of winning if you bet on three numbers is

\begin{equation*} E[\text{win on three numbers}] = 11 \cdot \frac{3}{38} - 1 \cdot \frac{35}{38} = - \frac{2}{38}. \end{equation*}

You can bet on all reds, all blacks, all evens (ignoring 0 and 00), or all odds; these bets pay a net of 1 dollar if you win. The expected value for any of these options is

\begin{equation*} E[\text{win on eighteen numbers}] = 1 \cdot \frac{18}{38} - 1 \cdot \frac{20}{38} = - \frac{2}{38}. \end{equation*}

There is one special way to bet which uses the 5 numbers {0, 00, 1, 2, 3} and pays 6 dollars. This is called the "top line" or "basket". Notice that with five numbers, matching the expected value of the other bets would require a payout of 6.20 dollars rather than a whole number of dollars. The expected value of winning in this case is

\begin{equation*} E[\text{win on top line of basket}] = 6 \cdot \frac{5}{38} - 1 \cdot \frac{33}{38} = - \frac{3}{38} \end{equation*}

which is of course worse; it is the only standard roulette bet with a different expected value.
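The pattern in these bets is easy to confirm; the following Python sketch (bet names and payouts as listed above) recomputes each expected value for a one-dollar wager:

from fractions import Fraction

# Expected net winnings per 1-dollar bet on an American wheel (38 sectors).
# Each entry: (numbers covered, net payout in dollars if the ball hits one of them).
bets = {
    "single number":      (1, 35),
    "two numbers":        (2, 17),
    "three numbers":      (3, 11),
    "eighteen numbers":   (18, 1),
    "top line or basket": (5, 6),
}

for name, (covered, payout) in bets.items():
    ev = Fraction(payout * covered - (38 - covered), 38)
    print(f"{name}: {ev}")    # -1/19 (i.e. -2/38) for every bet except the basket, which gives -3/38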

There are other possible ways to bet on roulette, but none provide a better expected value of winning. The moral of this story is that you should never bet on the five-number option, and if you ever get ahead while playing roulette you should probably stop quickly, since over a long period of time you can expect to lose an average of \(\frac{1}{19}\) dollars per game.

Going back to Pascal's wager, let X = 0 represent disbelief when God does not exist, X = 1 represent disbelief when God does exist, X = 2 represent belief when God does exist, and X = 3 represent belief when God does not exist. Let p be the likelihood that God exists. Then you can compute the expected value of disbelief and the expected value of belief by first creating a value function. Below, for argument's sake, we somewhat arbitrarily assign a value of one million to disbelief when God does not exist; the conclusions are the same if you choose any other finite number...

\begin{gather*} u(0) = 1,000,000, f(0) = 1-p\\ u(1) = -\infty, f(1) = p\\ u(2) = \infty, f(2) = p\\ u(3) = 0, f(3) = 1-p \end{gather*}

Then,

\begin{align*} E[\text{disbelief}] & = u(0)f(0) + u(1)f(1)\\ & = 1000000 \times (1-p) - \infty \times p\\ & = -\infty \end{align*}

if p>0. On the other hand,

\begin{align*} E[\text{belief}] & = u(2)f(2) + u(3)f(3)\\ & = \infty \times p + 0 \times (1-p)\\ & = \infty \end{align*}

if p>0. So Pascal's conclusion is that if there is even the slightest chance that God exists then belief is the smart and scientific choice.
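
Since the payouts here are infinite, the arithmetic collapses immediately; here is a tiny Python sketch using IEEE infinities (the value of p is an arbitrary positive choice, matching the assumption p > 0 above):

# Pascal's wager with IEEE infinities; any p > 0 gives the same conclusion.
# (With p = 0 the products 0 * inf are undefined, so the assumption matters.)
from math import inf

p = 0.001                                          # arbitrary positive probability
E_disbelief = 1_000_000 * (1 - p) + (-inf) * p     # finite gain plus an infinite loss
E_belief = inf * p + 0 * (1 - p)                   # infinite gain
print(E_disbelief, E_belief)                       # -inf inf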