Weierstrass Approximation Theorem: If f is continuous on [a,b] (f in C[a,b]), then for any tolerance e > 0 there exists a polynomial P(x) with |f(x) - P(x)| < e for all x in [a,b]; that is, f can be approximated arbitrarily well by polynomials on [a,b].
Remark: This is an existence theorem. In practice, finding the polynomial might be very difficult and of very high degree.
Taylor Polynomials: These are created by using Taylor's theorem and throwing away the remainder term.
Bad Ex: Approximate f(x) = 1/x about c = 1 using higher and higher degree Taylor polynomials. The Taylor series converges only for 0 < x < 2, so at points outside that interval (for example x = 3) the approximation gets worse and worse as the degree of the polynomial gets larger.
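A minimal Python sketch of this bad example (the function name is just illustrative): the degree-n Taylor polynomial of 1/x about c = 1 is the partial sum of 1 - (x-1) + (x-1)^2 - ..., which converges only for 0 < x < 2, so at x = 3 the values blow up as n grows.

    def taylor_1_over_x(x, n):
        """Degree-n Taylor polynomial of 1/x about c = 1, evaluated at x."""
        return sum((-1) ** k * (x - 1) ** k for k in range(n + 1))

    for n in (2, 5, 10, 20):
        # the true value is 1/3, but |x - 1| = 2 > 1, so higher degree is worse
        print(n, taylor_1_over_x(3.0, n))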
Lagrange Polynomials: Given n+1 values of the function (x0,f(x0)), (x1,f(x1)), ..., (xn,f(xn)), find the polynomial of degree at most n which passes through all these points.
Derivation: Write P(x) = f(x0)L0(x) + f(x1)L1(x) + ... + f(xn)Ln(x), where
Lk(x) = [(x-x0)...(x-xk-1)(x-xk+1)...(x-xn)] / [(xk-x0)...(xk-xk-1)(xk-xk+1)...(xk-xn)].
Each Lk satisfies Lk(xk) = 1 and Lk(xj) = 0 for j not equal to k, so P(xj) = f(xj) at every node and P has degree at most n.
Neville's Method: A recursive way to generate values of the Lagrange polynomial. It does not, however, produce a formula for the polynomial; instead, for each given value of x it produces the value of the interpolating polynomial at that x.
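A minimal Python sketch of Neville's method (the function name is just illustrative); it overwrites one vector, column by column of Neville's table, and returns the value of the interpolant at the single point x:

    def neville(xs, fs, x):
        """Value at x of the polynomial through the points (xs[i], fs[i])."""
        q = list(fs)                       # degree-0 column of Neville's table
        for j in range(1, len(xs)):        # build column j from column j-1
            for i in range(len(xs) - 1, j - 1, -1):
                q[i] = ((x - xs[i - j]) * q[i] - (x - xs[i]) * q[i - 1]) / (xs[i] - xs[i - j])
        return q[-1]

    print(neville([0.0, 1.0, 2.0], [1.0, 2.0, 5.0], 1.5))   # 3.25, the parabola x^2 + 1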
Error bound: See page 57 for the bound on the error of approximating f(x) with its Lagrange polynomial: if f has n+1 continuous derivatives, then f(x) - P(x) = f^(n+1)(c) (x-x0)(x-x1)...(x-xn) / (n+1)! for some point c (depending on x) in the interval containing x and the nodes.
Result: Any two polynomials of degree at most n which agree at n+1 distinct points are equal.
Pf: Suppose f(x) and g(x) are of degree at most n and agree at n+1 points. Then r(x) = f(x) - g(x) is of degree at most n with n+1 roots, so r(x) = 0 identically, i.e. f = g.
Ex: Construct a parabola passing through (1,4), (-1,0) and (2,9). Writing p(x) = ax^2 + bx + c and plugging in the points gives
4 = a + b + c
0 = a - b + c
9 = 4a + 2b + c
which is a 3x3 system of equations to solve. If the polynomial p(x) were of higher degree, this system would be hard to solve. But if we write p(x) = c + b(x-1) + a(x-1)(x+1), then plugging in the points yields (solving as we go along)
4 = c
0 = c - 2b, or b = 2
9 = c + b + 3a, or a = 1,
which is very easy to solve successively. This second form is called Newton's form.
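A tiny Python sketch of the successive solve for Newton's form above (the variable names are just illustrative); each equation introduces exactly one new unknown:

    pts = [(1, 4), (-1, 0), (2, 9)]
    c = pts[0][1]                                          # 4 = c
    b = (pts[1][1] - c) / (pts[1][0] - 1)                  # 0 = c + b(x-1) at x = -1, so b = 2
    a = (pts[2][1] - c - b * (pts[2][0] - 1)) / ((pts[2][0] - 1) * (pts[2][0] + 1))   # a = 1
    print(a, b, c)                                         # p(x) = 4 + 2(x-1) + (x-1)(x+1)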
Newton's Divided Differences: Successively
build up the polynomial by finding the coefficients as above.
(x0,f0) gives p0(x) = f0, which passes through the one point.
(x1,f1) gives p1(x) = p0(x) + (x-x0)b1, which passes through the first two points if b1 = (f1-f0)/(x1-x0).
(x2,f2) gives p2(x) = p1(x) + (x-x0)(x-x1)b2, which passes through the first three points if b2 = ?.
Continuing in this manner shows the coefficients bi are given by the divided difference table, page 63, etc.
The divided differences are defined by f[xj] = f(xj) and f[xj,...,xk] = (f[xj+1,...,xk] - f[xj,...,xk-1]) / (xk - xj), for j < k.
If x0 < x1 < ... < xn, this is called Newton's Forward Divided Difference Polynomial.
If x0 > x1 > ... > xn, this is called Newton's Backward Divided Difference Polynomial.
If Dx is constant (equally spaced nodes), the Divided Difference Formulas can be written in a closed form involving the binomial coefficients: with s = (x - x0)/Dx,
Pn(x) = f0 + C(s,1) Df0 + C(s,2) D^2 f0 + ... + C(s,n) D^n f0,
where D^k f0 is the k-th forward difference of f at x0 and C(s,k) = s(s-1)...(s-k+1)/k!.
Program Subroutine to Determine Newton's Divided Differences:
Storage: Only one vector is needed, overwriting its entries as below (a sketch follows the table):
f0 = c0
f1   f[x0,x1] = c1
f2   f[x1,x2]   f[x0,...,x2] = c2
...
fn   f[xn-1,xn]   f[xn-2,...,xn]   f[xn-3,...,xn]   ...   f[x0,...,xn] = cn
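A short Python sketch of the subroutine (the name newton_coefficients is just illustrative): it uses exactly one vector, overwriting it column by column so that it ends up holding c0, ..., cn as in the table above:

    def newton_coefficients(xs, fs):
        """Coefficients c[k] = f[x0, ..., xk] of the Newton form, stored in one vector."""
        c = list(fs)                          # start with the zeroth column f0, ..., fn
        for j in range(1, len(xs)):           # j-th order divided differences
            for i in range(len(xs) - 1, j - 1, -1):
                c[i] = (c[i] - c[i - 1]) / (xs[i] - xs[i - j])
        return c

    print(newton_coefficients([1, -1, 2], [4, 0, 9]))      # [4, 2.0, 1.0], as in the example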
Programming Assignment: Implement Newton's divided difference method for determining the interpolating polynomial for an input set (of up to 20) data points. Evaluate the resulting polynomial for at least 100 equally spaced values and write these (x,y) points to a file. Read this file into some graphing software and graph them there. Use several example data sets and see what you get for approximations. For one example, compute your data points using the function f(x) = 1/(1+x^2) on the interval [-5,5]. What happens as the number of data points "n" used here increases?
(To do this project, perhaps write a function which accepts as input the points to interpolate and returns the Newton coefficients. Write another function which accepts as input an x-value and these Newton coefficients and returns one y-value along the interpolant. A sketch of both appears below.)
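One possible shape for those two functions, sketched in Python and reusing newton_coefficients from the sketch above; newton_eval evaluates the Newton form by nested multiplication, and the driver writes 101 samples of the interpolant to a file for plotting (the file name and node choice are just one possibility):

    def newton_eval(xs, c, x):
        """Evaluate the Newton-form interpolant at a single x by nested multiplication."""
        y = c[-1]
        for k in range(len(c) - 2, -1, -1):
            y = y * (x - xs[k]) + c[k]
        return y

    f = lambda x: 1.0 / (1.0 + x * x)
    n = 10                                             # try increasing n here
    xs = [-5.0 + 10.0 * i / n for i in range(n + 1)]   # equally spaced nodes on [-5, 5]
    c = newton_coefficients(xs, [f(x) for x in xs])
    with open("interp.dat", "w") as out:
        for i in range(101):
            x = -5.0 + 10.0 * i / 100
            out.write(f"{x} {newton_eval(xs, c, x)}\n")
    # Plotting interp.dat shows large oscillations near the ends of [-5, 5]
    # which grow as n increases (the Runge phenomenon).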
Hermite Interpolation: Not only are the function
values specified but also several derivative values. We will only consider
the case when we match function values and 1st derivative values. This
is implemented using a set of basis functions similar to Lagrange.
Piecewise Interpolation: Instead of using one polynomial (possibly of very high degree) for the entire interval of interest, use several polynomials, each defined on a small subinterval. Then, to obtain the y-value for a given x-value, one must first determine which formula to use, and then evaluate that formula at the given x-value. If x0 < x1 < ... < xn and x0 < x < xn, the subinterval [xk-1, xk] containing x can be found by a simple search (for example, step k = 1, 2, ... until x <= xk, or bisect the list of nodes).
On the reference interval [0,1], the cubic Hermite piece matching the function and first derivative values at both ends is
p(u) = H0(u) f(0) + H1(u) f '(0) + H2(u) f '(1) + H3(u) f(1), where
H0(u) = 2u^3 - 3u^2 + 1,  H1(u) = u^3 - 2u^2 + u,  H2(u) = u^3 - u^2,  H3(u) = -2u^3 + 3u^2.
If data points (xk, yk) and slope values mk are given, then the piecewise formula for the interval [xk-1, xk] is
pk(x) = H0((x-xk-1)/Dxk) yk-1 + H1((x-xk-1)/Dxk) mk-1 Dxk + H2((x-xk-1)/Dxk) mk Dxk + H3((x-xk-1)/Dxk) yk,
where Dxk = xk - xk-1.
Notice that if we want to evaluate this piecewise approximation at a fixed number of equally spaced points in each subinterval, we can evaluate the basis polynomials once, store the results in vectors, and reuse them for every interval, as in the sketch below.
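A Python sketch of this reuse (the function and variable names are just illustrative): the four Hermite basis functions from above are evaluated once at m+1 equally spaced values of u in [0,1], and those stored values are reused in every subinterval:

    # H0..H3 as given above; H0, H3 multiply the endpoint values, H1, H2 the slopes
    H = (lambda u: 2*u**3 - 3*u**2 + 1,
         lambda u: u**3 - 2*u**2 + u,
         lambda u: u**3 - u**2,
         lambda u: -2*u**3 + 3*u**2)

    def piecewise_hermite(xs, ys, ms, m=10):
        """Sample the piecewise cubic Hermite interpolant at m+1 equally spaced
        points per subinterval, reusing one table of basis values."""
        us = [i / m for i in range(m + 1)]
        basis = [[h(u) for u in us] for h in H]        # evaluated once, reused below
        pts = []
        for k in range(1, len(xs)):
            dx = xs[k] - xs[k - 1]
            for i, u in enumerate(us):
                y = (basis[0][i] * ys[k - 1] + basis[1][i] * ms[k - 1] * dx
                     + basis[2][i] * ms[k] * dx + basis[3][i] * ys[k])
                pts.append((xs[k - 1] + u * dx, y))
        return pts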
Cubic Splines: Piece together cubics Sk(x) = ak + bk(x-xk-1) + ck(x-xk-1)^2 + dk(x-xk-1)^3, for xk-1 < x < xk, in such a way that the resulting approximation is continuous and has 2 continuous derivatives (no visible kinks). So, we impose the conditions: Sk(xk-1) = yk-1 and Sk(xk) = yk for each piece (interpolation and continuity), and S'k(xk) = S'k+1(xk), S''k(xk) = S''k+1(xk) at each interior node (continuity of the first and second derivatives). Counting, this gives 4n - 2 conditions for the 4n coefficients, so two conditions remain free.
End Conditions: The two remaining conditions are imposed at the endpoints, for example the natural spline (S''(x0) = S''(xn) = 0) or the clamped spline (S'(x0) and S'(xn) set to given slopes).
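A Python sketch of the natural cubic spline (end conditions S''(x0) = S''(xn) = 0), written in the standard "moment" form using the second derivatives Mk rather than the ak, bk, ck, dk coefficients above; the function names are illustrative, and the tridiagonal system is the usual one obtained from the continuity conditions:

    def natural_spline_moments(xs, ys):
        """Second derivatives M0..Mn of the natural cubic spline (M0 = Mn = 0)."""
        n = len(xs) - 1
        h = [xs[k + 1] - xs[k] for k in range(n)]
        # tridiagonal system for the interior moments M1..M(n-1)
        a = [h[k - 1] for k in range(1, n)]                       # sub-diagonal
        b = [2 * (h[k - 1] + h[k]) for k in range(1, n)]          # diagonal
        c = [h[k] for k in range(1, n)]                           # super-diagonal
        d = [6 * ((ys[k + 1] - ys[k]) / h[k] - (ys[k] - ys[k - 1]) / h[k - 1])
             for k in range(1, n)]
        for i in range(1, n - 1):                                 # forward elimination
            w = a[i] / b[i - 1]
            b[i] -= w * c[i - 1]
            d[i] -= w * d[i - 1]
        M = [0.0] * (n + 1)                                       # natural end conditions
        for i in range(n - 2, -1, -1):                            # back substitution
            M[i + 1] = (d[i] - c[i] * M[i + 2]) / b[i]
        return M

    def spline_eval(xs, ys, M, x):
        """Evaluate the spline piece containing x, written in terms of the moments."""
        k = 0
        while k < len(xs) - 2 and x > xs[k + 1]:                  # find the subinterval
            k += 1
        h = xs[k + 1] - xs[k]
        t0, t1 = xs[k + 1] - x, x - xs[k]
        return ((M[k] * t0**3 + M[k + 1] * t1**3) / (6 * h)
                + (ys[k] / h - M[k] * h / 6) * t0
                + (ys[k + 1] / h - M[k + 1] * h / 6) * t1)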
Introduce Bezier curves as recursively linearly interpolating....
Bezier Interpolants: Suppose that we are given "control points" bk = (bxk, byk) and we want a polynomial curve which passes through the first control point b0 and the last one bn, while the other points serve as attractors of the curve. Then the curve can be given parametrically, for 0 <= t <= 1, as
B(t) = C(n,0)(1-t)^n b0 + C(n,1) t (1-t)^(n-1) b1 + ... + C(n,n) t^n bn,
where C(n,k) = n!/(k!(n-k)!) are the binomial coefficients.
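A minimal Python sketch of evaluating the curve by exactly that recursive linear interpolation (de Casteljau's algorithm); the control points and function name are just for illustration:

    def bezier_point(ctrl, t):
        """Point on the Bezier curve with control points ctrl at parameter t in [0, 1]."""
        pts = list(ctrl)
        while len(pts) > 1:                 # one round of linear interpolation per pass
            pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
                   for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
        return pts[0]

    ctrl = [(0, 0), (1, 2), (3, 2), (4, 0)]             # cubic: starts at b0, ends at b3
    print([bezier_point(ctrl, t / 4) for t in range(5)])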
Derivative Continuity of Bezier Interpolants:
The derivative of a Bezier curve is a Bezier curve of one lower degree, obtained by simply differentiating the formula above:
B'(t) = n [ C(n-1,0)(1-t)^(n-1) (b1 - b0) + C(n-1,1) t (1-t)^(n-2) (b2 - b1) + ... + C(n-1,n-1) t^(n-1) (bn - bn-1) ].
By equating these derivatives at the points where two curves meet, we force the resulting piecewise interpolant to be smooth. We can continue to force successive derivatives to be equal at the points where the curves meet to get an interpolant with a very high degree of smoothness, as in the sketch below. Notice the similarity to splines.
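A small Python sketch of the derivative-matching idea for two cubic pieces (all control points here are hypothetical): since a cubic's one-sided derivatives at its ends are 3(b1 - b0) and 3(b3 - b2), placing the second piece's first interior control point at the mirror image of the first piece's last interior control point across the shared endpoint makes the first derivatives agree at the join:

    p = [(0, 0), (1, 2), (3, 2), (4, 0)]                  # first cubic piece
    join = p[-1]
    mirror = (2 * join[0] - p[-2][0], 2 * join[1] - p[-2][1])
    q = [join, mirror, (7, -3), (8, 0)]                   # second cubic piece, C1 at the join
    # one-sided derivatives at the join: both print (3, -6)
    print(tuple(3 * (u - v) for u, v in zip(p[3], p[2])),
          tuple(3 * (u - v) for u, v in zip(q[1], q[0])))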