Theory of a Class of Discrete Optimal Control Systems†

Abstract
The problem considered in this paper is that of finding optimal controls for a class of fixed-duration processes in systems described by non-linear difference equations. The discrete versions of the adjoint system and the Hamiltonian are used, in conjunction with the techniques found in the original proofs of the Pontryagin maximum principle, to derive conditions necessary for a control to be optimal. These necessary conditions are shown to be related to the Pontryagin conditions for continuous systems in the following manner: the requirement of a global maximum of the Hamiltonian weakens to a local-maximum or stationarity condition, while the transversality conditions remain identical.
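The discrete necessary conditions summarized above can be sketched in a standard formulation; the symbols below ($x_k$, $u_k$, $\psi_k$, $f$) are illustrative conventions for this class of problem, not notation taken from the paper itself:

```latex
% State equation over a fixed horizon k = 0, 1, ..., N-1:
x_{k+1} = f(x_k, u_k)

% Discrete Hamiltonian (adjoint vector \psi advanced one step):
H_k(\psi_{k+1}, x_k, u_k) = \psi_{k+1}^{\mathsf{T}} f(x_k, u_k)

% Discrete adjoint (costate) system, run backward in k:
\psi_k = \frac{\partial H_k}{\partial x_k}

% Necessary condition on an optimal control u_k^*:
% only stationarity (or a local maximum) of H_k, in contrast to the
% global maximum required by the continuous Pontryagin principle:
\left.\frac{\partial H_k}{\partial u_k}\right|_{u_k = u_k^*} = 0
```

The weakening from a global to a local condition is the key structural difference: in discrete time the set of attainable state increments need not be convex, so the variational arguments behind the continuous maximum principle yield only first-order (stationarity) information about the control.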
