Stability of discrete systems over a finite interval of time

Abstract
In many cases of practical interest there is concern with the behaviour of dynamic systems only over a finite time interval. This concern may arise in one of two ways: in the first case the system under consideration is defined only over a fixed and finite interval of time, while in the second case the system is defined for all time but its behaviour is of interest only over a finite time interval. Recently, Weiss and Infante (1965, 1967) treated the problem of system stability over a finite time interval for the case of continuous systems. In this paper a theory is developed for the stability of discrete systems over a finite interval of time. The dynamic systems considered are general enough to include unforced systems, systems under the influence of perturbing forces, linear systems, non-linear systems, time-invariant systems, time-varying systems, simple systems and composite systems. In the present development various definitions of stability are considered and corresponding stability theorems are stated and proved. These theorems yield sufficient conditions for stability and in general involve the existence of Lyapunov-like functions which do not possess the usual definiteness requirements on V and ΔV.
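As an illustration of the type of result involved (this is only a sketch of a standard finite-time stability formulation, not a restatement of the paper's own definitions or theorems), consider a discrete system $x_{k+1} = f(k, x_k)$ on the interval $k = k_0, k_0+1, \ldots, k_0+N$. The system may be called stable with respect to $(\alpha, \beta, N)$, where $0 < \alpha < \beta$, if
\[
\|x_{k_0}\| < \alpha \quad \Longrightarrow \quad \|x_k\| < \beta, \qquad k_0 \le k \le k_0 + N .
\]
A Lyapunov-type sufficient condition may then be sketched as follows: if there exist a function $V(k, x)$ and a non-negative sequence $\varphi(k)$ such that $\Delta V(k, x_k) = V(k+1, f(k, x_k)) - V(k, x_k) \le \varphi(k)$ whenever $\|x_k\| < \beta$, and
\[
\sup_{\|x\| < \alpha} V(k_0, x) \;+\; \sum_{k=k_0}^{k_0+N-1} \varphi(k) \;<\; \inf_{\substack{k_0 \le k \le k_0+N \\ \|x\| \ge \beta}} V(k, x),
\]
then no trajectory starting in the $\alpha$-ball can leave the $\beta$-ball within the interval. Note that in such a formulation neither $V$ nor $\Delta V$ is required to be sign definite, which is the sense in which the usual definiteness requirements are relaxed.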
