Initial Value Problems

As we have seen, there is no general method for solving an equation of the form

$\displaystyle y^\prime = f(x,y)$ (7.6.1)

and in this context two questions may be pertinent.
  1. Does (7.6.1) admit solutions at all (i.e., the existence problem)?
  2. Is there a method to find solutions of (7.6.1) in case the answer to the above question is in the affirmative?

The answers to these two questions are not simple, but partial answers are available when additional restrictions are imposed on the function $ f.$ The details are discussed in this section.

For $ a, b \in {\mathbb{R}}$ with $ a > 0, b > 0,$ and a point $ (x_0, y_0) \in {\mathbb{R}}^2,$ we define the rectangle

$\displaystyle S= \{(x,y) \in {\mathbb{R}}^2: \vert x - x_0\vert \leq a, \; \vert y - y_0\vert \leq b \}.$

DEFINITION 7.6.1 (Initial Value Problems)   Let $ f: S \longrightarrow {\mathbb{R}}$ be a continuous function on $ S.$ The problem of finding a solution $ y$ of
$\displaystyle y^\prime = f(x,y), \; (x, y ) \in S, \; x \in I, \;\; {\mbox { with }} \;\; y(x_0) = y_0$     (7.6.2)

in a neighbourhood $ I$ of $ x_0$ (or an open interval $ I$ containing $ x_0$ ) is called an Initial Value Problem, henceforth denoted by IVP.

The condition $ y(x_0) = y_0$ in (7.6.2) is called the INITIAL CONDITION stated at $ x = x_0$ and $ \;y_0$ is called the INITIAL VALUE.

Further, we assume that $ a$ and $ b$ are finite. Let

$\displaystyle M = \max\{\vert f(x,y)\vert: (x,y) \in S \}.$

Such an $ M$ exists since $ S$ is closed and bounded and $ f$ is continuous on $ S.$ Let $ h = \min (a, \frac{b}{M}).$ The ensuing proposition is simple and hence its proof is omitted.

PROPOSITION 7.6.2   A function $ y$ is a solution of IVP (7.6.2) if and only if $ y$ satisfies

$\displaystyle y(x) = y_0 + \int_{x_0}^x f(s, y(s))\, ds.$ (7.6.3)

In the absence of any knowledge of a solution of the IVP (7.6.2), we now try to find an approximate solution. Any solution of the IVP (7.6.2) must satisfy the initial condition $ y(x_0) = y_0.$ Hence, as a crude approximation to the solution, we define

$\displaystyle y_0(x) = y_0 \;
{\mbox{ for all }} \; x \in [x_0 - h, x_0 + h].$

Equation (7.6.3) of Proposition 7.6.2 now helps us to refine or improve the approximation $ y_0,$ with the hope of obtaining a better approximate solution. We define

$\displaystyle y_1(x) = y_0 + \int_{x_0}^x f(s, y_0(s))\, ds$

and for $ n = 2, 3,
\ldots,$ we inductively define

$\displaystyle y_n(x) = y_0 + \int_{x_0}^x f(s, y_{n-1}(s))\, ds \;\;
{\mbox{ for all }} \; x \in [x_0 - h, x_0 + h].$

We have not yet checked, for instance, whether the point $ (s, y_n(s))$ lies in $ S,$ so that the integrand above is defined; the theory is formalised in the latter part of this section. To get ourselves motivated, let us apply the above method to the following IVP.

EXAMPLE 7.6.3   Solve the IVP

$\displaystyle y^\prime = -y, \; y(0) = 1, \; -1 \le x \le 1.$


Solution: From Proposition 7.6.2, a function $ y$ is a solution of the above IVP if and only if

$\displaystyle y(x) = 1 - \int_{0}^x y(s)\, ds.$

We have $ y_0(x) \equiv 1$ and

$\displaystyle y_1 = 1 - \int_0^x ds = 1 - x.$

So,

$\displaystyle y_2 = 1 - \int_0^x (1-s) ds = 1 - x + \frac{x^2}{2!}.$

By induction, one can easily verify that

$\displaystyle y_n = 1 - x + \frac{x^2}{2!} - \frac{x^3}{3!} +
\cdots + (-1)^n \frac{ x^n}{n!}.$

Note: The solution of the given IVP is

$\displaystyle y = e^{-x}, \;\;
{\mbox{ and }} \;\; \lim_{n {\longrightarrow}\infty} y_n = e^{-x}.$

This example justifies the use of the word approximate solution for the $ y_n$ 's.
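
The successive approximations can also be generated symbolically on a computer. The following is a minimal Python sketch using SymPy; it reproduces the iterates of Example 7.6.3. The helper picard_iterates and the number of iterates shown are illustrative choices, not a standard library routine.

    import sympy as sp

    x, s = sp.symbols('x s')

    def picard_iterates(f, x0, y0, n):
        # Returns [y_0, y_1, ..., y_n], where y_0(x) = y0 and
        # y_k(x) = y0 + integral from x0 to x of f(s, y_{k-1}(s)) ds.
        ys = [sp.sympify(y0)]
        for _ in range(n):
            prev = ys[-1].subs(x, s)                      # y_{k-1}(s)
            ys.append(y0 + sp.integrate(f(s, prev), (s, x0, x)))
        return ys

    # Example 7.6.3: y' = -y, y(0) = 1; the iterates are the partial sums
    # of the Taylor series of e^{-x}.
    for k, yk in enumerate(picard_iterates(lambda t, y: -y, 0, 1, 4)):
        print(k, sp.expand(yk))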

We now formalise the above procedure.

DEFINITION 7.6.4 (Picard's Successive Approximations)   Consider the IVP (7.6.2). For $ x \in I$ with $ \vert x - x_0\vert \leq a,$ define inductively
$\displaystyle y_0(x) = y_0 \;\; {\mbox{ and, for }}\; n=1, 2, \ldots, \qquad y_n(x) = y_0 + \int_{x_0}^x f(s, y_{n-1}(s))\, ds.$     (7.6.4)

Then $ y_0, y_1, \ldots, y_n, \ldots$ are called Picard's successive approximations to the IVP (7.6.2).

Whether (7.6.4) is well defined or not is settled in the following proposition.

PROPOSITION 7.6.5   Picard's successive approximations $ y_n,$ defined by (7.6.4) for the IVP (7.6.2), are well defined on the interval $ \vert x - x_0\vert \leq h = \min \{a, \frac{b}{M} \},$ i.e., for $ x \in [x_0 - h, x_0 + h].$

Proof. We have to verify that, for each $ n = 0, 1, 2, \ldots,$ the point $ (s, y_n(s))$ belongs to the domain of definition of $ f$ for $ \vert s - x_0\vert \leq h;$ otherwise the integrand $ f(s, y_n(s))$ appearing in (7.6.4) need not be defined. For $ n = 0,$ it is obvious that $ (s, y_0) \in S$ as $ \vert s - x_0\vert \leq h \leq a$ and $ \vert y_0 - y_0\vert= 0 \leq b.$ For $ n = 1,$ we notice that, if $ \vert x - x_0\vert \leq h,$ then

$\displaystyle \vert y_1 - y_0\vert \leq M \vert x- x_0\vert \leq M h \leq b.$

So, $ (x, y_1) \in S$ whenever $ \vert x - x_0\vert \leq h.$

The rest of the proof is by the method of induction. We have established the result for $ n = 1,$ namely

$\displaystyle (x, y_1) \in S \;\; {\mbox{ if }}
\;\; \vert x - x_0\vert \le h.$

Assume that for $ k=1, 2, \ldots, n-1,$ $ (x, y_k)\in S$ whenever $ \vert x - x_0\vert \le h.$ Now, by definition of $ y_n,$ we have

$\displaystyle y_n - y_0 = \int_{x_0}^x f(s, y_{n-1}) ds.$

But then, by the induction hypothesis, $ (s, y_{n-1}(s)) \in S$ for $ \vert s - x_0\vert \leq h,$ and hence

$\displaystyle \vert y_n - y_0 \vert \le M \vert x - x_0\vert \leq M h \le b.$

This shows that $ (x, y_n ) \in S$ whenever $ \vert x - x_0\vert \le h.$ Hence the claim also holds for $ k=n,$ and the proof of the proposition is complete. $ \blacksquare$

Let us now come back to Example 7.6.3 in the light of Proposition 7.6.5.

EXAMPLE 7.6.6   Compute the successive approximations to the IVP

$\displaystyle y^\prime = - y, \;\; -1 \leq x \leq 1, \; \vert y-1\vert \leq 1 {\mbox{ and }} y(0) = 1.$ (7.6.5)


Solution: Note that $ x_0 = 0, y_0 = 1, f(x,y) = -y,$ and $ a = b = 1.$ The set $ S$ on which we are studying the differential equation is

$\displaystyle S = \{(x,y) : \vert x\vert \leq 1, \vert y-1\vert \leq 1 \}.$

Following Proposition 7.6.5, on this set

$\displaystyle M = \max \{ \vert y\vert : (x,y) \in S \} = 2 \;\; {\mbox{ and }} \;\;
h = \min \{ 1, 1/2\} = 1/2.$

Therefore, if we use Proposition 7.6.5, the approximate solutions $ y_n$ are guaranteed to be defined only on the interval $ [-\displaystyle\frac{1}{2}, \displaystyle\frac{1}{2} ].$

Observe that the exact solution $ y = e^{-x}$ and the approximate solutions $ y_n$ of Example 7.6.3 exist on $ [-1, 1],$ whereas the approximate solutions obtained above are guaranteed only on the interval $ [-\displaystyle\frac{1}{2}, \displaystyle\frac{1}{2} ].$

That is, for an IVP, the approximate solutions $ y_n$ may exist on a larger interval than the one obtained by applying Proposition 7.6.5.

We now consider another example.

EXAMPLE 7.6.7   Find Picard's successive approximations for the IVP

$\displaystyle y^\prime = f(y) , \;\; 0 \leq x \leq 1, \; y \geq 0 {\mbox{ and }} y(0) = 0;$ (7.6.6)

where

$\displaystyle f(y) = \sqrt{y} \; {\mbox{ for }} y \ge 0.$


Solution: By definition $ y_0(x)=y_0 \equiv 0$ and

$\displaystyle y_1(x) = y_0 + \int_0^x f(y_0) ds = 0 + \int_0^x \sqrt{0} ds = 0.$

A similar argument implies that $ y_n(x) \equiv 0$ for all $ n= 2, 3, \ldots$ and $ \lim\limits_{n \longrightarrow \infty} y_n(x) \equiv 0.$ Also, it can be easily verified that $ y(x) \equiv 0$ is a solution of the IVP (7.6.6).

However, $ y(x) = \displaystyle\frac{x^2}{4}, \; 0 \leq x \leq 1,$ is also a solution of (7.6.6), and the $ y_n$'s do not converge to $ \displaystyle\frac{x^2}{4}.$ Note here that the IVP (7.6.6) has at least two solutions.
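
Indeed, a direct substitution verifies the second solution: for $ 0 \leq x \leq 1,$

$\displaystyle y(x) = \frac{x^2}{4} \; \Longrightarrow \; y^\prime(x) = \frac{x}{2} = \sqrt{\frac{x^2}{4}} = \sqrt{y(x)}, \quad {\mbox{ and }} \quad y(0) = 0.$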

The following result is about the existence of a unique solution to a class of IVPs. We state the theorem without proof.

THEOREM 7.6.8 (Picard's Theorem on Existence and Uniqueness)   Let $ S = \{(x,y) : \vert x - x_0\vert \leq a, \; \vert y - y_0\vert \leq b \},$ and $ a, b > 0.$ Let $ f : S {\longrightarrow}{\mathbb{R}}$ be such that $ f$ as well as $ \displaystyle\frac{\partial f}{\partial y}$ are continuous on $ S.$ Also, let $ M, K \in {\mathbb{R}}$ be constants such that

$\displaystyle \vert f\vert \leq M, \; \vert\frac{\partial f}{\partial y}\vert \leq K \; {\mbox{ on }} \; S.$

Let $ h = \min \{a, b/M\}.$ Then the sequence of successive approximations $ \{y_n\}$ (defined by (7.6.4)) for the IVP (7.6.2) converges uniformly on $ \vert x - x_0\vert \leq h$ to a solution of the IVP (7.6.2). Moreover, the solution to the IVP (7.6.2) is unique.

Remark 7.6.9   The theorem asserts the existence of a unique solution on a subinterval $ \vert x - x_0\vert \leq h$ of the given interval $ \vert x - x_0\vert \leq a.$ Since this subinterval is a neighbourhood of $ x_0,$ the result is also called the local existence of a unique solution. A natural question is whether the solution exists on the whole of the interval $ \vert x - x_0\vert \leq a.$ The answer to this question is beyond the scope of this book.

Whenever we talk of Picard's theorem, we mean it in this local sense.
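
To get a feel for the constants appearing in Theorem 7.6.8, $ M,$ $ K$ and $ h$ can be estimated numerically on a grid over $ S.$ The following is a minimal Python sketch applied to the data of Example 7.6.6, where $ f(x,y) = -y$ and $ \frac{\partial f}{\partial y} = -1;$ the helper estimate_constants, the grid resolution and the use of NumPy are illustrative choices, not part of the theorem.

    import numpy as np

    def estimate_constants(f, dfdy, x0, y0, a, b, n=201):
        # Grid estimates of M = max|f| and K = max|df/dy| over the rectangle
        # S = {|x - x0| <= a, |y - y0| <= b}, and h = min(a, b/M).
        X, Y = np.meshgrid(np.linspace(x0 - a, x0 + a, n),
                           np.linspace(y0 - b, y0 + b, n))
        M = np.abs(f(X, Y)).max()
        K = np.abs(dfdy(X, Y)).max()
        return M, K, min(a, b / M)

    # Example 7.6.6: f(x, y) = -y on S = {|x| <= 1, |y - 1| <= 1}.
    M, K, h = estimate_constants(lambda x, y: -y,
                                 lambda x, y: -np.ones_like(y),
                                 x0=0.0, y0=1.0, a=1.0, b=1.0)
    print(M, K, h)   # expected: M = 2, K = 1, h = 0.5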

EXERCISE 7.6.10  
  1. Compute the sequence $ \{y_n\}$ of the successive approximations to the IVP

    $\displaystyle y^\prime = y \; (y-1), \; y(x_0) = 0,
x_0 \geq 0.$

  2. Show that the solution of the IVP

    $\displaystyle y^\prime
= y \; (y-1), \; y(x_0) = 1, x_0 \geq 0$

    is $ y \equiv 1, \; x
\geq x_0.$
  3. The IVP

    $\displaystyle y^\prime = \sqrt{y}, \;
y(0) = 0, x \geq 0$

    has solutions $ y_1 \equiv 0$ as well as $ y_2 = \displaystyle\frac{x^2}{4}, x \geq 0.$ Why does the existence of the two solutions not contradict Picard's theorem?
  4. Consider the IVP

    $\displaystyle y^\prime = y, \; y(0) = 1 \; {\mbox{ in }}
\{(x,y ) : \vert x\vert \leq a, \vert y\vert \leq b \}$

    for any $ a, b > 0.$
    1. Compute the interval of existence of the solution of the IVP by using Theorem 7.6.8.
    2. Show that $ y = e^x$ is the solution of the IVP which exists on whole of $ {\mathbb{R}}.$
    This again shows that the solution to an IVP may exist on a larger interval than the one implied by Theorem 7.6.8.


