
Revision of Taylor Series

Many of the numerical methods used in this course make use of a Taylor series, either for deriving the formula to use or for estimating the error of the numerical solution. The function $f(x)$ can be expanded in a power series in powers of $(x - x_{0})$ as
\begin{displaymath}
f(x) = f(x_{0}) + (x - x_{0})f^{\prime}(x_{0}) + (x -
x_{0})^{2}{f^{\prime\prime}(x_{0})\over 2!} + \cdots + (x -
x_{0})^{n}{f^{(n)}(x_{0})\over n!} + R_{n+1},
\end{displaymath} (1.1)

where the error is given by
\begin{displaymath}
R_{n+1} = (x - x_{0})^{n+1}{f^{(n+1)}(c)\over (n+1)!}, \qquad x_{0} <
c < x.
\end{displaymath} (1.2)

Note that $c$ lies between the values of $x_{0}$ and $x$. Here we have assumed that $x_{0} < x$. In (1.1) we are defining the $n$th derivative, evaluated at the point $x_{0}$, by

\begin{displaymath}
f^{(n)}(x_{0}) = {d^{n}f\over dx^{n}} \vert _{x = x_{0}}.
\end{displaymath}

Finally, we have assumed that the function $f(x)$ is at least $n+1$ times differentiable. In this course we will assume that all our functions possess a suitable number of derivatives. We will not be concerned with functions like $x^{1/2}$, which does not have a derivative at $x=0$.
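The truncation behaviour described by (1.1) and (1.2) can be seen numerically. The following is a minimal Python sketch (not part of the original notes), using $f(x) = e^{x}$ purely because every derivative of $e^{x}$ is again $e^{x}$, so $f^{(k)}(x_{0}) = e^{x_{0}}$ for all $k$:

```python
import math

def taylor_exp(x, n, x0=0.0):
    """Degree-n Taylor polynomial of exp about x0.

    Every derivative of exp is exp, so f^(k)(x0) = exp(x0) for all k.
    """
    return sum(math.exp(x0) * (x - x0)**k / math.factorial(k)
               for k in range(n + 1))

# The remainder R_{n+1} behaves like (x - x0)^{n+1}/(n+1)!,
# so the error shrinks rapidly as n grows.
for n in (1, 3, 5):
    print(n, abs(taylor_exp(0.5, n) - math.exp(0.5)))
```

Each extra pair of terms reduces the error by roughly a factor of $(x-x_{0})^{2}/n^{2}$, in line with the remainder formula (1.2).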

As an example, if $f(x) = x^{3} + 2x^{2} - 3x + 1$, then

\begin{displaymath}
f^{\prime\prime}(x) = 6x + 4,
\end{displaymath}

and so

\begin{displaymath}
f^{\prime\prime}(x_{0}) = 6x_{0} + 4.
\end{displaymath}

Suppose $x_{0} = 2$, then

\begin{displaymath}
f^{\prime\prime}(x_{0}) = 12 + 4 = 16.
\end{displaymath}
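The arithmetic above can be checked numerically. This is a small Python sketch (not part of the notes) using a central second-difference approximation to $f^{\prime\prime}(x_{0})$, which reproduces the value 16 at $x_{0}=2$:

```python
def f(x):
    return x**3 + 2*x**2 - 3*x + 1

def second_derivative(g, x0, h=1e-4):
    # Central second-difference approximation to g''(x0).
    return (g(x0 + h) - 2*g(x0) + g(x0 - h)) / h**2

print(second_derivative(f, 2.0))  # close to the exact value 16
```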

If we set

\begin{displaymath}
x = x_{0} + h,
\end{displaymath}

then an equivalent expression to (1.1) is given by
\begin{displaymath}
f(x) = f(x_{0}+h) = f(x_{0}) + h f^{\prime}(x_{0}) +
h^{2}{f^{\prime\prime}(x_{0})\over 2!} + \cdots +
h^{n}{f^{(n)}(x_{0})\over n!} + R_{n+1},
\end{displaymath} (1.3)

where the error term is
\begin{displaymath}
R_{n+1} = h^{n+1}{f^{(n+1)}(c)\over (n+1)!}, \qquad x_{0} < c < x_{0}+h.
\end{displaymath} (1.4)

You need to learn this formula and be able to use it confidently. It is useful when you need to approximate a function in the neighbourhood of a particular point.

Example 1.1. Expand $\cos x$ about $x=0$, so that $x_{0} = 0$. Remember to use radians when calculus is involved.

Table 1.1:
\begin{tabular}{ll}
$f(x) = \cos x$ & $f(0) = 1$ \\
$f^{\prime}(x)= - \sin x$ & $f^{\prime}(0) = 0 $ \\
$f^{\prime \prime}(x) = - \cos x$ & $f^{\prime \prime}(0) = - 1 $ \\
$f^{\prime\prime\prime}(x) = \sin x$ & $f^{\prime\prime\prime}(0) = 0$ \\
$f^{iv}(x) = \cos x$ & $f^{iv}(0) = 1$
\end{tabular}

Therefore, from (1.1), the Taylor series for $\cos x$ about $x=0$ is

\begin{displaymath}
\cos x = 1 - {x^{2}\over 2!} + {x^{4}\over 4!} - \cdots = 1 - {1\over
2}x^{2} + {1\over 24}x^{4} - \cdots
\end{displaymath}
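As a quick numerical check (a Python sketch, not part of the original notes), the degree-4 polynomial above can be compared against the library cosine; the agreement is excellent near $x=0$ and degrades as $|x|$ grows:

```python
import math

def cos_taylor4(x):
    # Degree-4 Taylor polynomial of cos about x = 0.
    return 1 - x**2/2 + x**4/24

# Error grows with distance from the expansion point x0 = 0.
for x in (0.1, 0.5, 1.5):
    print(x, abs(cos_taylor4(x) - math.cos(x)))
```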

As an example of the use of a Taylor series, we can use this to estimate the first positive zero of $\cos x$. Obviously the answer is $\pi/2 = 1.5708$ but for other functions this will not be known and the Taylor series can sometimes provide a useful first estimate.

If we only take the first two non-zero terms then we approximate $\cos x$ by

\begin{displaymath}
\cos x \approx 1 - {1\over 2}x^{2} = 0, \qquad x^{2} = 2.
\end{displaymath}

Hence, the first positive zero is approximately

\begin{displaymath}
x = \sqrt{2} = 1.4142.
\end{displaymath}

If we now take the first three non-zero terms then we have

\begin{displaymath}
\cos x \approx 1 - {1\over 2}x^{2} + {1\over 24} x^{4} = 0.
\end{displaymath}

Now we can solve the quadratic in $x^{2}$

\begin{eqnarray*}
x^{4} - 12x^{2} + 24 & = & 0, \\
x^{2} & = & {12 \pm \sqrt{12^{2} - 4\times 24}\over 2} = 6 \pm
\sqrt{12}, \\
x & = & \sqrt{6 - \sqrt{12}} = 1.5924.
\end{eqnarray*}



Why did we take the negative square root? Obviously we are looking for the first positive zero, and the positive square root would give an estimate for the second zero, at $3\pi/2$.
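The same root extraction can be written as a short Python sketch (illustrative, not part of the notes), applying the quadratic formula to $u = x^{2}$ and then taking the negative square root in $u$:

```python
import math

# Solve x^4 - 12 x^2 + 24 = 0 as a quadratic in u = x^2.
disc = math.sqrt(12**2 - 4*24)   # sqrt(144 - 96) = sqrt(48)
u_minus = (12 - disc) / 2        # 6 - sqrt(12), the smaller root
x = math.sqrt(u_minus)           # estimate of the first positive zero
print(x)  # close to 1.59, versus the true zero pi/2 = 1.5708
```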

Can we do any better than this? Since $1.5924$ is an estimate for the first zero, we could try to form the Taylor series about $1.59$ instead of about $0$. I could have used $1.5924$, but $1.59$ will do just as well. We now form the table as shown in Table 1.2.

Table 1.2:
\begin{tabular}{ll}
$f(x) = \cos x$ & $f(1.59) = -0.0192$ \\
$f^{\prime}(x)= - \sin x$ & $f^{\prime}(1.59) = - 0.9998 $
\end{tabular}

The Taylor series is

\begin{displaymath}
\cos x = -0.0192 - (x - 1.59)0.9998 + \cdots
\end{displaymath}

Since we are seeking a zero of $\cos x$, setting the left hand side equal to 0 and rearranging gives

\begin{displaymath}
x - 1.59 = -{0.0192\over 0.9998} = -0.0192.
\end{displaymath}

Hence,

\begin{displaymath}
x = 1.59 - 0.0192 = 1.5708.
\end{displaymath}

Not a bad estimate. In fact, we have actually undertaken one step of the Newton-Raphson method and that will be described in detail later.
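The step just performed can be written out explicitly (a Python sketch, not part of the notes): one Newton-Raphson update $x_{1} = x_{0} - f(x_{0})/f^{\prime}(x_{0})$ with $f = \cos$ and $f^{\prime} = -\sin$, starting from the series estimate $x_{0} = 1.59$:

```python
import math

x0 = 1.59
# One Newton-Raphson step for cos x = 0: f = cos, f' = -sin.
x1 = x0 - math.cos(x0) / (-math.sin(x0))
print(round(x1, 4))  # 1.5708, in agreement with pi/2
```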

Example 1.2. Expand $(1 + x)^{1/2}$ about $x=0$.

Table 1.3:
\begin{tabular}{ll}
$f(x) = (1+x)^{1/2}$ & $f(0) = 1$ \\
$f^{\prime}(x) = {1\over 2}(1+x)^{-1/2}$ & $f^{\prime}(0) = {1\over 2}$ \\
$f^{\prime\prime}(x) = -{1\over 4}(1+x)^{-3/2}$ & $f^{\prime\prime}(0) = -{1\over 4}$
\end{tabular}

Hence, the Taylor series about $x=0$ is

\begin{displaymath}
(1+x)^{1/2} = 1 + {x\over2} -{x^{2}\over 4\times 2!} + \cdots = 1 +
{x\over2} - {x^{2}\over 8} + \cdots
\end{displaymath}

We can use this approximation to estimate the value of $\sqrt{1.2}$. Here we take $x = 0.2$ so that $1.2 = 1 + 0.2$. Thus,

\begin{displaymath}
\sqrt{1.2} = 1.0954 \approx 1 + {0.2\over 2} - {(0.2)^{2}\over 8} =
1 + 0.1 - 0.005 = 1.095.
\end{displaymath}
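Checking this arithmetic numerically (a Python sketch, not part of the notes), the three-term series agrees with $\sqrt{1.2}$ to about three decimal places:

```python
import math

x = 0.2
approx = 1 + x/2 - x**2/8   # three-term series for sqrt(1 + x)
print(round(approx, 4))      # series estimate, versus math.sqrt(1.2)
```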


Prof. Alan Hood
2000-02-01