Section 9.3 Chi-Square Distribution
The following distribution is related to both the Normal Distribution and to the Gamma Distribution 8.4.3. Initially, consider a gamma distribution with probability function given by the formula
\[ f(x) = \frac{1}{\Gamma(r)\,\mu^r}\, x^{r-1} e^{-x/\mu}, \qquad x \in (0,\infty). \]
Replacing \(\mu\) with 2 and \(r\) with \(r/2\) gives the special case
\[ f(x) = \frac{1}{\Gamma(r/2)\, 2^{r/2}}\, x^{r/2-1} e^{-x/2}, \]
which is given a special name below.
Definition 9.3.1. Chi-Square Probability Function.
Given a natural number \(r\text{,}\) suppose \(X\) is a random variable over the space \(R = (0,\infty)\) with probability function given by
\[ f(x) = \frac{1}{\Gamma(r/2)\, 2^{r/2}}\, x^{r/2-1} e^{-x/2}. \]
Then \(X\) has a Chi-Square distribution with \(r\) degrees of freedom. This is often denoted \(\chi^2(r)\text{.}\)
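To see that this formula is a legitimate probability function, you can check in a Sage cell that it integrates to 1 over \((0,\infty)\text{.}\) A minimal sketch, using \(r = 5\) purely as an illustrative choice:

    x = var('x')
    r = var('r')
    # Chi-square probability function from Definition 9.3.1
    f = x^(r/2 - 1) * exp(-x/2) / (gamma(r/2) * 2^(r/2))
    # Total probability should be 1; check with the illustrative choice r = 5
    integrate(f.subs(r=5), x, 0, oo)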
As with all distributions before, one can determine the mean, variance, skewness, and kurtosis for a general \(\chi^2\) distribution directly. However, since \(\chi^2(r)\) is a special case of the gamma distribution, these statistics should also be special cases of the corresponding gamma statistics, and indeed that is the case.
Theorem 9.3.2. \(\chi^2\) statistics.
If \(X \sim \chi^2(r)\text{,}\) then
\[ \mu = r, \qquad \sigma^2 = 2r, \qquad \text{skewness} = \sqrt{8/r}, \qquad \text{excess kurtosis} = 12/r. \]
Proof.
Apply the gamma distribution statistics from 8.4.3 with \(\mu\) replaced by 2 and \(r\) replaced by \(r/2\text{.}\) For example, the gamma mean \(r\mu\) becomes \((r/2)(2) = r\) and the gamma variance \(r\mu^2\) becomes \((r/2)(2^2) = 2r\text{.}\) Likewise, the skewness \(2/\sqrt{r}\) becomes \(2/\sqrt{r/2} = \sqrt{8/r}\) and the excess kurtosis \(6/r\) becomes \(12/r\text{.}\)
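These values can also be verified directly in a Sage cell by integrating against the probability function. A quick sketch of such a check, with \(r = 7\) chosen only for illustration:

    x = var('x')
    r = 7  # an illustrative choice of degrees of freedom
    f = x^(r/2 - 1) * exp(-x/2) / (gamma(r/2) * 2^(r/2))
    mu = integrate(x * f, x, 0, oo)                # mean: expect r = 7
    sigma2 = integrate((x - mu)^2 * f, x, 0, oo)   # variance: expect 2r = 14
    print(mu, sigma2)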
So, if the \(\chi^2\) distribution is just a special case of the gamma distribution, why save it for this point in time? The gamma distribution's random variable was set up to solve a particular problem: finding the probability that it takes a given amount of time to reach a defined number of successes. The \(\chi^2\) distribution, by contrast, comes from reparameterizing the gamma formula with no particular problem to solve in mind. However, \(\chi^2\) has a number of properties that are useful for making inferences from sample data, as you will see later. The theorem below shows an important relationship between the \(\chi^2\) distribution and the standard normal distribution.
To express this relationship requires the use of multivariate statistics. This text is focused on single-variable statistics, so what follows will be a little careless; take a follow-up course to rigorously develop what to do with various functions of random variables. For now, just consider this: suppose that you plan to do an experiment on some distribution with a given mean and given variance and that the experiment has random variable \(X_1\text{.}\) Planning to do the experiment again results in a random variable \(X_2\text{.}\) Continuing, you will get (say) \(n\) planned random experiments that will result in \(n\) different random variables
\[ X_1, X_2, \ldots, X_n. \]
You can then create a new random variable that is a combination of those variables. One common choice is the sum of these variables,
\[ X_1 + X_2 + \cdots + X_n, \]
or the average of these variables to create the variable \(\overline{X}\) where
\[ \overline{X} = \frac{X_1 + X_2 + \cdots + X_n}{n}. \]
Below, notice that the new variable is created by taking the sum of the squares of standard normal variables. This is yet another possible function of random variables, and it establishes the relationship between the normal and \(\chi^2\) distributions.
Theorem 9.3.3. Relationship between Normal and \(\chi^2\).
If \(Z_1, Z_2, \ldots, Z_r\) are \(r\) independent standard normal variables, then
\[ Z_1^2 + Z_2^2 + \cdots + Z_r^2 \sim \chi^2(r). \]
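Although the proof is beyond the scope of this text, you can gather evidence for the theorem by simulation in a Sage cell. The sketch below, with the illustrative choices \(r = 4\) and 10000 trials, sums squares of standard normal samples and compares the sample mean and variance to the values \(r\) and \(2r\) from Theorem 9.3.2:

    # Simulation sketch: sums of squares of r standard normal draws
    r = 4          # illustrative degrees of freedom
    N = 10000      # illustrative number of trials
    Z = RealDistribution('gaussian', 1)   # standard normal
    samples = [sum(Z.get_random_element()^2 for _ in range(r)) for _ in range(N)]
    m = sum(samples) / N                          # expect roughly r = 4
    v = sum((s - m)^2 for s in samples) / (N - 1) # expect roughly 2r = 8
    print(m, v)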
It can also be difficult to compute Chi-Square probabilities manually, so you will perhaps want to use a numerical approximation in this case as well. The TI graphing calculator can be used with \(P(a \le X \le b) \approx \chi^2\text{cdf}(a,b,r)\text{.}\) Or you can use the interactive cell below.
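For instance, one way to perform this computation in Sage uses RealDistribution; the values \(a = 2\text{,}\) \(b = 7\text{,}\) and \(r = 5\) below are illustrative only:

    # P(a <= X <= b) for a chi-square variable with r degrees of freedom
    a, b, r = 2, 7, 5   # illustrative values
    T = RealDistribution('chisquared', r)
    print(T.cum_distribution_function(b) - T.cum_distribution_function(a))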
Checkpoint 9.3.4.
As with the normal distribution, there is also a way to compute the inverse Chi-Square function using Sage.
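A minimal sketch of that inverse computation, again using RealDistribution and the illustrative values \(p = 0.95\) and \(r = 5\text{:}\)

    # Find the value x with P(X <= x) = 0.95 when X has 5 degrees of freedom
    T = RealDistribution('chisquared', 5)
    print(T.cum_distribution_function_inv(0.95))   # roughly 11.07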