Theorem (Taylor's Formula with Remainder):
Suppose that f is a real function on the closed interval [c, d] and n is a positive integer.
Furthermore, suppose that the (n+1)st derivative of f is everywhere finite on the open
interval (c, d) and that the nth derivative of f is continuous on the closed interval [c, d].
Let a be any point in the interval [c, d]. Then for every point x in [c, d] with x not equal to a
there exists a point \xi, which is strictly between x and a, such that the following formula holds:
f(x) = \sum_{k=0}^{n} \frac{f^{(k)}(a)}{k!} (x - a)^k + E_{n}(x)
where the error is given by
E_{n}(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!} (x - a)^{n+1}
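Example: As one concrete illustration (this particular function is not part of the statement above), take f(x) = e^x on an interval containing 0, with a = 0 and n = 2. Every derivative of e^x is e^x, so the formula reads
e^x = 1 + x + \frac{x^2}{2!} + E_{2}(x), where E_{2}(x) = \frac{e^{\xi}}{3!} x^3
for some \xi strictly between 0 and x.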
Definition:
A real-valued function f defined on an interval [c, d] is said to belong to C^{\infty} on [c, d]
provided that f has derivatives of all orders defined at every point of [c, d].
Note:
This means that all of the following derivatives exist on [c, d]:
f'(x), f''(x), f'''(x), \ldots
Definition (of a Taylor's Series):
If f belongs to C^{\infty} on [c, d] and we pick a point a such that c < a < d, then the Taylor's
series about a generated by f is the following power series:
\sum_{k=0}^{\infty} \frac{f^{(k)}(a)}{k!} (x - a)^k
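Example: As an illustration (the notes above do not single out a particular function), take f(x) = \ln x on [1/2, 3/2] with a = 1. Since f^{(k)}(x) = (-1)^{k-1}(k-1)!/x^{k} for k \ge 1, the coefficients are f^{(k)}(1)/k! = (-1)^{k-1}/k, and the Taylor's series about 1 is
\sum_{k=1}^{\infty} \frac{(-1)^{k-1}}{k} (x - 1)^k = (x - 1) - \frac{(x-1)^2}{2} + \frac{(x-1)^3}{3} - \cdots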
Note:
If a = 0, the Taylor's series is called a Maclaurin series. It simplifies to:
\sum_{k=0}^{\infty} \frac{f^{(k)}(0)}{k!} x^k
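Example: Since every derivative of e^x equals e^x, and hence equals 1 at 0, the Maclaurin series of e^x is
\sum_{k=0}^{\infty} \frac{x^k}{k!} = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots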
Note: One must always ask whether a given Taylor's series converges and whether it
converges to f(x). The following corollary helps to answer this.
Corollary (Taylor's Series Convergence):
If f belongs to C^{\infty} on [c, d] and we pick a point a such that c < a < d, then the Taylor's
series converges to f(x) if and only if
\lim_{n \to \infty} E_{n}(x) = 0
(where the error (remainder) E_{n}(x) is defined as shown above)
Note: A typical way to show that this limit is zero is to show that the derivatives of f are uniformly
bounded (a single constant bounds every derivative f^{(n)}), at least in some interval about a.
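Example: For f(x) = \sin x, every derivative is \pm\sin x or \pm\cos x, so |f^{(n+1)}(\xi)| \le 1 for every n and every \xi. The error is then bounded by
|E_{n}(x)| \le \frac{|x - a|^{n+1}}{(n+1)!},
which tends to 0 as n \to \infty, so the Taylor's series of \sin x converges to \sin x for every x.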
Note: If we know that a Taylor's series converges to f(x) then we have that:
f(x) = \sum_{k=0}^{\infty} \frac{f^{(k)}(a)}{k!} (x - a)^k
     = \lim_{n \to \infty} \sum_{k=0}^{n} \frac{f^{(k)}(a)}{k!} (x - a)^k
If a = 0, this simplifies to:
f(x) = \sum_{k=0}^{\infty} \frac{f^{(k)}(0)}{k!} x^k
     = \lim_{n \to \infty} \sum_{k=0}^{n} \frac{f^{(k)}(0)}{k!} x^k
Note: Since we cannot do an infinite sum on the computer, we typically compute a partial sum
and use Taylor's Formula with Remainder to try to produce a bound on how bad the error can be.
The actual error, of course, might be much less than our bound. The bound provides a worst case
for the error.
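Example: The sketch below (in Python; the choice of f(x) = e^x, the point x, and the degree n are illustrative assumptions, not anything fixed by these notes) computes a partial sum of the Maclaurin series of e^x together with a worst-case error bound taken from the remainder term, using e^{|x|} as a bound for e^{\xi}.

import math

def exp_partial_sum(x, n):
    """Partial sum of the Maclaurin series of e**x through the x**n term."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

def exp_error_bound(x, n):
    """Worst-case bound on the remainder E_n(x) = e**xi * x**(n+1) / (n+1)!.

    Since xi lies strictly between 0 and x, e**xi <= e**abs(x), so
    |E_n(x)| <= e**abs(x) * abs(x)**(n+1) / (n+1)!.
    """
    return math.exp(abs(x)) * abs(x)**(n + 1) / math.factorial(n + 1)

if __name__ == "__main__":
    x, n = 1.0, 6
    approx = exp_partial_sum(x, n)
    bound = exp_error_bound(x, n)
    actual = abs(math.exp(x) - approx)
    # The actual error is never larger than the worst-case bound.
    print(f"partial sum  = {approx:.10f}")
    print(f"error bound  = {bound:.2e}")
    print(f"actual error = {actual:.2e}")

The actual error printed will typically be noticeably smaller than the bound, which is exactly the point of the note above: the bound is a worst case, not the error itself.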