Section 2.4 Higher Degree Regression
Continuing in a similar fashion to the previous section, consider now an approximation using a quadratic function $y = ax^2 + bx + c$. In this case, the total squared error would be of the form

\[ E(a,b,c) = \sum_{k=1}^{n} \left( y_k - (a x_k^2 + b x_k + c) \right)^2. \]

Taking all three partials gives

\[ \frac{\partial E}{\partial a} = -2 \sum_{k=1}^{n} x_k^2 \left( y_k - (a x_k^2 + b x_k + c) \right), \]
\[ \frac{\partial E}{\partial b} = -2 \sum_{k=1}^{n} x_k \left( y_k - (a x_k^2 + b x_k + c) \right), \]
\[ \frac{\partial E}{\partial c} = -2 \sum_{k=1}^{n} \left( y_k - (a x_k^2 + b x_k + c) \right). \]

Once again, setting each partial equal to zero and solving gives the normal equations for the best-fit quadratic:

\[ a \sum x_k^4 + b \sum x_k^3 + c \sum x_k^2 = \sum x_k^2 y_k, \]
\[ a \sum x_k^3 + b \sum x_k^2 + c \sum x_k = \sum x_k y_k, \]
\[ a \sum x_k^2 + b \sum x_k + c\,n = \sum y_k. \]
One can, of course, use software to perform these calculations. Further, the ideas above extend directly: the same derivation yields formulas for a best-fit cubic. The interactive cell below determines the optimal quadratic polynomial for a given set of data points and, by commenting and uncommenting lines as suggested, will also determine the best-fit cubic.
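It is convenient to express the three normal equations in matrix form, which is how the cell below computes the coefficients. Writing

\[ X = \begin{bmatrix} x_1^2 & x_1 & 1 \\ x_2^2 & x_2 & 1 \\ \vdots & \vdots & \vdots \\ x_n^2 & x_n & 1 \end{bmatrix}, \qquad A = \begin{bmatrix} a \\ b \\ c \end{bmatrix}, \qquad Y = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix}, \]

the normal equations above are exactly the rows of $X^T X A = X^T Y$, so $A = (X^T X)^{-1} X^T Y$. This is the computation performed by the line A = (Xt*X).inverse()*Xt*Y in the cell.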
var('x')
xpts = vector([-1, 0, 1, 3, 5, 5])
ypts = vector([5, 3, 0, -1, 3, 6])
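# the column of ones produces the constant term in the design matrix below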
ones = vector([1, 1, 1, 1, 1, 1])
xpts3 = []
xpts2 = []
pts = []
for k in range(len(xpts)):
    # xpts3.append(xpts[k]^3)  # uncomment this for best cubic
    xpts2.append(xpts[k]^2)
    pts.append((xpts[k], ypts[k]))
# xpts3 = vector(xpts3) # uncomment this for best cubic
xpts2 = vector(xpts2)
# For best cubic, instead uncomment the first and comment the second
# X = matrix([xpts3]).stack(xpts2).stack(xpts).stack(ones).transpose()
X = matrix([xpts2]).stack(xpts).stack(ones).transpose()
Y = matrix(ypts).transpose()
Xt = X.transpose()
print(X)
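# solve the normal equations (X^T X) A = X^T Y for the coefficient vector A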
A = (Xt*X).inverse()*Xt*Y
# For best cubic, instead uncomment the first and comment the second
# [a,b,c,d] = [A[0][0],A[1][0],A[2][0],A[3][0]]
[a,b,c] = [A[0][0],A[1][0],A[2][0]]
# For best cubic, instead uncomment the first and comment the second
# f = a*x^3 + b*x^2 + c*x + d
f = a*x^2 + b*x + c
banner = "The best-fit polynomial is given by $%s$"%str(latex(f))
G = points(pts,size=20)
H = plot(f, (x, min(xpts), max(xpts)), title=banner)
show(G+H)
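As a quick cross-check, the same coefficients can be recovered with Sage's built-in find_fit, which minimizes the squared error numerically. The sketch below is meant to run in a fresh cell (the code above rebinds a, b, and c to numbers):

# sanity check: fit the same quadratic model numerically with find_fit
var('a b c x')
# the same data points as above, as (x, y) pairs
data = [(-1, 5), (0, 3), (1, 0), (3, -1), (5, 3), (5, 6)]
model(x) = a*x^2 + b*x + c
find_fit(data, model)  # returns [a == ..., b == ..., c == ...]

As an aside, forming (Xt*X).inverse() explicitly is fine at this scale, but for larger systems (Xt*X).solve_right(Xt*Y) is the more numerically sensible idiom.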