
Section 2.4 Higher Degree Regression

Continuing in a similar fashion to the previous section, consider now an approximation using a quadratic function \(f(x) = ax^2 + bx + c\). In this case, the total squared error would be of the form
\[ \operatorname{TSE}(a,b,c) = \sum_{k=1}^{n} \left( a x_k^2 + b x_k + c - y_k \right)^2. \]
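As a concrete illustration, the total squared error can be computed directly from this definition. The sketch below is only a sketch: the data values and candidate coefficients are made up for illustration.

```python
# A minimal sketch of computing TSE(a, b, c) for a candidate quadratic
# on hypothetical data (the values below are made up).
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 7.2, 12.8, 21.0]

def tse(a, b, c):
    # Sum of squared residuals (a*x_k^2 + b*x_k + c - y_k)^2 over all points.
    return sum((a * x**2 + b * x + c - y)**2 for x, y in zip(xs, ys))

print(tse(1.0, 1.0, 1.0))  # TSE for the candidate quadratic x^2 + x + 1
```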
Taking all three partials gives
\[ \frac{\partial \operatorname{TSE}}{\partial a} = \sum_{k=1}^{n} 2\left( a x_k^2 + b x_k + c - y_k \right) x_k^2 \]
\[ \frac{\partial \operatorname{TSE}}{\partial b} = \sum_{k=1}^{n} 2\left( a x_k^2 + b x_k + c - y_k \right) x_k \]
\[ \frac{\partial \operatorname{TSE}}{\partial c} = \sum_{k=1}^{n} 2\left( a x_k^2 + b x_k + c - y_k \right) \cdot 1. \]
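For example, setting the first of these equal to zero, dividing by 2, and distributing the sum across the terms gives
\[ \sum_{k=1}^{n} 2\left( a x_k^2 + b x_k + c - y_k \right) x_k^2 = 0 \quad\Longrightarrow\quad a\sum_{k=1}^{n} x_k^4 + b\sum_{k=1}^{n} x_k^3 + c\sum_{k=1}^{n} x_k^2 = \sum_{k=1}^{n} x_k^2 y_k, \]
and the other two partial derivatives simplify in the same way.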
Once again, setting each partial derivative equal to zero and solving gives the normal equations for the best-fit quadratic:
\[ a\sum_{k=1}^{n} x_k^4 + b\sum_{k=1}^{n} x_k^3 + c\sum_{k=1}^{n} x_k^2 = \sum_{k=1}^{n} x_k^2 y_k \]
\[ a\sum_{k=1}^{n} x_k^3 + b\sum_{k=1}^{n} x_k^2 + c\sum_{k=1}^{n} x_k = \sum_{k=1}^{n} x_k y_k \]
\[ a\sum_{k=1}^{n} x_k^2 + b\sum_{k=1}^{n} x_k + c\sum_{k=1}^{n} 1 = \sum_{k=1}^{n} y_k. \]
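These three equations form a linear system in \(a\), \(b\), and \(c\) that software can assemble and solve directly. The following sketch (assuming NumPy, with made-up data) builds the matrix of power sums from the normal equations above and solves the system; the names and values are only for illustration.

```python
import numpy as np

# Hypothetical data points (made up for illustration).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 7.2, 12.8, 21.0])

# Coefficient matrix of power sums and right-hand side, matching the
# normal equations above: rows correspond to the a-, b-, and c-equations.
M = np.array([
    [np.sum(x**4), np.sum(x**3), np.sum(x**2)],
    [np.sum(x**3), np.sum(x**2), np.sum(x)],
    [np.sum(x**2), np.sum(x),    len(x)],
])
rhs = np.array([np.sum(x**2 * y), np.sum(x * y), np.sum(y)])

a, b, c = np.linalg.solve(M, rhs)
print(f"best-fit quadratic: {a:.4f} x^2 + {b:.4f} x + {c:.4f}")
```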
One can easily use software to perform these calculations, of course. Further, you can extend the ideas from above and derive formulas for a best-fit cubic. The interactive cell below determines the optimal quadratic polynomial for a given set of data points and, by commenting and uncommenting as suggested, will also determine the best-fit cubic.
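The interactive cell itself is not reproduced here. As a stand-in, the sketch below (assuming NumPy, with made-up data) fits a quadratic using numpy.polyfit and shows, left commented out, the one-line change that produces the best-fit cubic instead.

```python
import numpy as np

# Hypothetical data points (made up for illustration).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.0, 2.8, 7.1, 13.2, 20.9, 31.5])

degree = 2          # best-fit quadratic
# degree = 3        # uncomment to compute the best-fit cubic instead

coeffs = np.polyfit(x, y, degree)   # highest-degree coefficient first
print(coeffs)
```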