
Expressing this relationship requires multivariate statistics. This text focuses on single-variable statistics, so what follows will be somewhat informal; take a follow-up course to rigorously develop how to work with functions of random variables. For now, just consider this: suppose you plan an experiment on some distribution with a given mean and given variance, and let that experiment correspond to a random variable \(X_1\text{.}\) Planning to repeat the experiment gives a second random variable \(X_2\text{.}\) Continuing, you will have (say) \(n\) planned experiments corresponding to \(n\) different random variables
\begin{equation*} X_1, X_2, X_3, \ldots, X_n\text{.} \end{equation*}
You can then create a new random variable as a combination of these variables. One common choice is the sum,
\begin{equation*} Y = \sum_{k=1}^n X_k = X_1 + X_2 + X_3 + \cdots + X_n \end{equation*}
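As an illustrative sketch (the choice of distribution here is an assumption, not part of the text), you can simulate repeated realizations of the sum \(Y\) when each \(X_k\) is uniform on \([0,1]\), which has mean \(1/2\) and variance \(1/12\):

```python
import random
import statistics

# Illustrative assumption: each X_k is uniform on [0, 1],
# so mu = 1/2 and sigma^2 = 1/12.
random.seed(0)
n = 10          # number of planned experiments
trials = 5000   # number of times the whole plan is realized

y_values = []
for _ in range(trials):
    xs = [random.random() for _ in range(n)]  # realizations of X_1, ..., X_n
    y_values.append(sum(xs))                  # one draw of Y = X_1 + ... + X_n

# For a sum of n independent variables: mean n*mu = 5,
# variance n*sigma^2 = 10/12, about 0.833.
print(statistics.mean(y_values))
print(statistics.variance(y_values))
```

The printed sample mean and sample variance should land near \(n\mu = 5\) and \(n\sigma^2 \approx 0.833\text{,}\) previewing the rules for sums of independent random variables.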
or the average of these variables to create the variable \(\overline{X}\) where
\begin{equation*} \overline{X} = \frac{\sum_{k=1}^n X_k}{n} = \frac{X_1 + X_2 + X_3 + \cdots + X_n}{n}. \end{equation*}
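The behavior of \(\overline{X}\) can also be sketched by simulation. In this hypothetical example (the exponential distribution and its parameters are assumptions for illustration), each \(X_k\) has mean \(2\) and variance \(4\text{:}\)

```python
import random
import statistics

# Illustrative assumption: each X_k is exponential with rate 1/2,
# so mu = 2 and sigma^2 = 4.
random.seed(0)
n = 1000        # number of planned experiments averaged together
trials = 2000   # number of times we realize the whole plan

xbar_values = []
for _ in range(trials):
    xs = [random.expovariate(0.5) for _ in range(n)]  # mean 1/0.5 = 2
    xbar_values.append(sum(xs) / n)                   # one draw of X-bar

# The mean of X-bar should be near mu = 2, and its variance near
# sigma^2 / n = 4 / 1000 = 0.004: averaging shrinks the spread.
print(statistics.mean(xbar_values))
print(statistics.variance(xbar_values))
```

Note how the variance of \(\overline{X}\) is far smaller than the variance of a single \(X_k\text{;}\) this shrinking by a factor of \(n\) is developed formally later.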
Below, notice that the new random variable is created by taking the sum of the squares of standard normal variables. This is yet another function of random variables, and it is the one that establishes the relationship between the normal and \(\chi^2\) distributions.
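That relationship can be previewed with a quick simulation sketch (the sample sizes here are arbitrary choices): summing the squares of \(n\) independent standard normal draws should produce values whose mean and variance match those of a \(\chi^2\) distribution with \(n\) degrees of freedom, namely \(n\) and \(2n\text{.}\)

```python
import random
import statistics

random.seed(1)
n = 5            # number of squared standard normals; the degrees of freedom
trials = 20000   # number of draws of the new variable

y_values = []
for _ in range(trials):
    zs = [random.gauss(0.0, 1.0) for _ in range(n)]   # standard normal draws
    y_values.append(sum(z * z for z in zs))           # Z_1^2 + ... + Z_n^2

# A chi-squared variable with n degrees of freedom has mean n and variance 2n,
# so these should come out near 5 and 10.
print(statistics.mean(y_values))
print(statistics.variance(y_values))
```

This only checks the first two moments, of course; the full claim that the sum of squares follows a \(\chi^2\) distribution is what the text develops next.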