Section 2.4 Higher Degree Regression

Continuing in a similar fashion to the previous section, consider now an approximation using a quadratic function \(f(x) = ax^2 + bx + c\text{.}\) In this case, the total squared error would be of the form

\begin{equation*} TSE(a,b,c) = \sum_{k=1}^n (a x_k^2 + b x_k + c - y_k)^2. \end{equation*}

Taking all three partials gives

\begin{equation*} TSE_a = \sum_{k=1}^n 2(a x_k^2 + b x_k + c - y_k) \cdot x_k^2 \end{equation*}
\begin{equation*} TSE_b = \sum_{k=1}^n 2(a x_k^2 + b x_k + c - y_k) \cdot x_k \end{equation*}
\begin{equation*} TSE_c = \sum_{k=1}^n 2(a x_k^2 + b x_k + c - y_k) \cdot 1 . \end{equation*}

Once again, setting equal to zero and solving gives the normal equations for the best-fit quadratic

\begin{equation*} a \sum_{k=1}^n x_k^4 + b \sum_{k=1}^n x_k^3 + c \sum_{k=1}^n x_k^2 = \sum_{k=1}^n x_k^2 y_k \end{equation*}
\begin{equation*} a \sum_{k=1}^n x_k^3 + b \sum_{k=1}^n x_k^2 + c \sum_{k=1}^n x_k = \sum_{k=1}^n x_k y_k \end{equation*}
\begin{equation*} a \sum_{k=1}^n x_k^2 + b \sum_{k=1}^n x_k + c \sum_{k=1}^n 1 = \sum_{k=1}^n y_k. \end{equation*}
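The normal equations form a \(3 \times 3\) linear system in \(a\text{,}\) \(b\text{,}\) and \(c\text{.}\) As a sketch of how they might be solved numerically (using NumPy; the data points here are hypothetical, generated from a known quadratic so the result can be checked):

```python
import numpy as np

# Hypothetical data sampled from y = 2x^2 - 3x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2 * x**2 - 3 * x + 1

# Power sums appearing in the normal equations: S[p] = sum of x_k^p
S = [np.sum(x**p) for p in range(5)]

# Coefficient matrix and right-hand side of the normal equations
M = np.array([[S[4], S[3], S[2]],
              [S[3], S[2], S[1]],
              [S[2], S[1], S[0]]])
rhs = np.array([np.sum(x**2 * y), np.sum(x * y), np.sum(y)])

a, b, c = np.linalg.solve(M, rhs)
print(a, b, c)  # recovers 2, -3, 1 (up to rounding)
```

Since the data were generated exactly by a quadratic, the solver recovers its coefficients; with noisy data the same system yields the least-squares quadratic.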

Of course, software can easily perform these calculations. Further, the ideas above extend directly: taking partials of the total squared error for \(f(x) = ax^3 + bx^2 + cx + d\) and setting them equal to zero derives the normal equations for a best-fit cubic. The interactive cell below determines the optimal quadratic polynomial for a given set of data points and, by commenting and uncommenting as suggested, will also determine the best-fit cubic.
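A minimal stand-in for such a cell might use NumPy's built-in least-squares polynomial fit, which solves the same minimization problem; the data below are hypothetical, and the degree is changed by commenting and uncommenting one line:

```python
import numpy as np

# Hypothetical data points
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 0.9, 3.2, 10.1, 20.8])

coeffs = np.polyfit(x, y, 2)    # best-fit quadratic: returns [a, b, c]
# coeffs = np.polyfit(x, y, 3)  # uncomment for the best-fit cubic

print(coeffs)
```

`np.polyfit` returns the coefficients in order of decreasing degree, so for the quadratic fit the result is \((a, b, c)\text{.}\)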