2 Definitions Related to Stability for a Generic System
We know that a general time-invariant system (linear or nonlinear) with no external input can be modeled by the following equation:

$$\dot{x} = f(x) \qquad (2)$$
Equilibrium Point: An equilibrium point (or equilibrium state) of a system is a point in the state space at which the dynamics of the system vanish; once the state is brought to such a point, it remains there forever.
Thus the equilibrium points are the solutions of the following equation:

$$f(x) = 0 \qquad (3)$$
One should note that since an LTI system with no external input can be modeled by

$$\dot{x} = Ax,$$

the origin $x = 0$ is always an equilibrium point of such a system, and if $A$ is nonsingular it is the only one.
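The nonsingularity condition can be checked numerically. The sketch below, using an assumed example matrix $A$ (not from the text), verifies that $Ax = 0$ has only the trivial solution when $A$ has full rank:

```python
import numpy as np

# A hypothetical nonsingular system matrix for an LTI system x_dot = A x.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

# Full rank means A x = 0 has the unique solution x = 0,
# so the origin is the only equilibrium point.
assert np.linalg.matrix_rank(A) == A.shape[0]
x_eq = np.linalg.solve(A, np.zeros(2))
```

If $A$ were singular, every vector in its null space would be an equilibrium point, giving a whole subspace of equilibria rather than just the origin.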
Nonlinear systems can have multiple equilibrium points. Thus, when we talk about the stability of a nonlinear system, we do so with respect to a particular equilibrium point.
For convenience, we state all definitions for the case when the equilibrium point is at the origin. There is no loss of generality in doing so, because any equilibrium point can be shifted to the origin via a change of variables.
Example:
Find the equilibrium points of the following nonlinear system:

$$\dot{x}_1 = x_2, \qquad \dot{x}_2 = -x_1 + x_1^3$$

Equating $\dot{x}_1$ and $\dot{x}_2$ to 0, we get $x_2 = 0$ always, while $x_1(x_1^2 - 1) = 0$ shows that $x_1$ can take 3 values: 0, 1 and -1. Thus the system has three equilibrium points, located at (0,0), (1,0) and (-1,0) respectively.
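The calculation above can be checked numerically. The sketch below assumes dynamics $\dot{x}_1 = x_2$, $\dot{x}_2 = -x_1 + x_1^3$ (consistent with the equilibria stated in the text) and finds the roots of the cubic directly:

```python
import numpy as np

# Assumed dynamics (consistent with the equilibria listed in the text):
#   x1_dot = x2           -> zero iff x2 = 0
#   x2_dot = -x1 + x1**3  -> zero iff x1**3 - x1 = 0
# Coefficients of x1**3 + 0*x1**2 - x1 + 0:
roots = np.roots([1.0, 0.0, -1.0, 0.0])

# Pair each root of the cubic with x2 = 0; "+ 0.0" normalizes -0.0.
equilibria = sorted((round(r.real, 9) + 0.0, 0.0) for r in roots)
print(equilibria)  # [(-1.0, 0.0), (0.0, 0.0), (1.0, 0.0)]
```

Each tuple is one equilibrium point $(x_1, x_2)$, matching the three points found by hand.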
Stability in the sense of Lyapunov: The equilibrium point $x = 0$ of (2) is stable if, for each $\varepsilon > 0$, there exists a $\delta = \delta(\varepsilon) > 0$ such that

$$\|x(0)\| < \delta \implies \|x(t)\| < \varepsilon, \quad \forall\, t \ge 0.$$