Next: 3. Runge-Kutta Method Up: 2. Error Estimates and Previous: 2. Error Estimates and

2.1 Theorem:

Let $ y_i \,\,\,(1\leq i \leq n)$ be the approximations at $ x_i$ of a solution $ y$ of (2.1). Let the (exact) solution $ y$ be twice continuously differentiable on $ [a,b]$, where $ x_0 = a$ and $ x_0 + nh = b$. Further let

$\displaystyle \vert f_y(x,y)\vert\leq L\,\,$    and $\displaystyle \,\ \vert y''(x)\vert\leq M, \, \, \, a\leq x \leq b$

where L and M are positive constants. Then

$\displaystyle \vert e_i\vert \leq \frac{hM}{2L}(e^{ihL}-1)$ (2.3)

Proof:
By Taylor's theorem (with Lagrange remainder) we have,

$\displaystyle y(x_{i+1})=y(x_i)+hy'(x_i)+\frac{h^2}{2!}y''(c_i),\,\,\, x_i \leq c_i\leq x_{i+1}$

we also know,

$\displaystyle y_{i+1}=y_i+hf(x_i,y_i)$

Recalling that $ y'(x_i)=f(x_i,y(x_i))$ and writing $ e_i = y(x_i)-y_i$, subtraction now leads to

$\displaystyle e_{i+1}=e_i+h\{f(x_i,y(x_i))-f(x_i,y_i)\}+\frac{h^2}{2!}y''(c_i)$ (2.4)

Again, by the mean value theorem,

$\displaystyle f(x_i,y(x_i))-f(x_i,y_i)=f_y(x_i,d)(y(x_i)-y_i)$

$\displaystyle \qquad\qquad\qquad = f_y (x_i, d)\, e_i$

where $ d$ lies between $ y_i$ and $ y(x_i)$. Substituting in (2.4) and using $ \vert f_y\vert\leq L$, we get

$\displaystyle \vert e_{i+1}\vert\leq \vert e_i\vert+hL\vert e_i\vert+\frac{h^2}{2}M = (1+hL)\vert e_i\vert+\frac{h^2}{2}M$ (2.5)

Let $ g_{i+1}$ be the solution of the difference equation

$\displaystyle g_{i+1}=(1+hL)g_i + \frac{h^2}{2}M, \,\, g_0 =0.$ (2.6)

Claim:

$\displaystyle g_i \geq \vert e_i\vert, \,\,\, i=1,2,...n$

The claim follows by induction on $ i$, comparing (2.5) with (2.6) and using $ g_0 = \vert e_0\vert = 0$. Also by induction (or by solving the linear difference equation (2.6) directly),

$\displaystyle g_i=A(1+hL)^i-A, \,\,\, A=\frac{hM}{2L}$

$\displaystyle \qquad\qquad\qquad \qquad \leq A (e^{ihL}-1)$

$\displaystyle \qquad\qquad\qquad \qquad \quad\,\,\,\,\,\,=\frac{hM}{2L}(e^{L(x_i -a)}-1)$

since $ 1+hL\leq e^{hL}$ and $ ih = x_i - a$.
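As a quick sanity check, the closed form above can be verified numerically against the recurrence (2.6); the values of $ h$, $ L$, $ M$ below are illustrative choices of mine, not taken from the text.

```python
import math

# Check g_i = A(1+hL)^i - A against the recurrence (2.6),
# and the bound g_i <= A(e^{ihL} - 1). Parameter values are illustrative.
h, L, M = 0.1, 2.0, 3.0
A = h * M / (2 * L)

g = 0.0  # g_0 = 0
for i in range(1, 11):
    g = (1 + h * L) * g + (h**2 / 2) * M   # iterate (2.6)
    closed = A * (1 + h * L)**i - A        # closed-form solution
    assert abs(g - closed) < 1e-12         # they agree
    assert closed <= A * (math.exp(i * h * L) - 1) + 1e-15  # 1+hL <= e^{hL}
```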

The theorem is now proved once we notice that $ \vert e_i\vert\leq
g_i,\,\,\,i=1,2,\dots,n.$
Remark: Inequality (2.3) implies that the accumulated (global) error is $ O(h)$, since $ e^{L(x_i-a)}-1$ stays bounded as $ h\to 0$ with $ x_i$ fixed; the local truncation error per step, by contrast, is $ O(h^2)$. The theorem also gives an upper bound for the error.
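To illustrate the theorem, here is a small numerical check (my example, not from the text) on the test problem $ y' = y$, $ y(0)=1$ over $ [0,1]$, for which $ f_y = 1$ gives $ L = 1$, and $ \vert y''(x)\vert = e^x \leq e = M$:

```python
import math

# Hedged check of bound (2.3) on the test problem y' = y, y(0) = 1
# over [0, 1]; this example is an illustration, not from the text.
# f(x, y) = y, so f_y = 1 and L = 1; y(x) = e^x, so |y''| <= e = M.
a, b, n = 0.0, 1.0, 100
h = (b - a) / n
L, M = 1.0, math.e

y = 1.0  # y_0 = y(a)
for i in range(1, n + 1):
    y = y + h * y                  # Euler step: y_{i+1} = y_i + h f(x_i, y_i)
    x_i = a + i * h
    err = abs(math.exp(x_i) - y)   # |e_i|
    bound = h * M / (2 * L) * (math.exp(i * h * L) - 1)  # right side of (2.3)
    assert err <= bound            # the theorem's bound holds at every step
```

Halving $ h$ roughly halves the final error, consistent with the $ O(h)$ behaviour of the bound.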