Theory of a Class of Discrete Optimal Control Systems
- 1 December 1964
- journal article
- control section
- Published by Taylor & Francis in Journal of Electronics and Control
- Vol. 17 (6), 697-711
- https://doi.org/10.1080/00207216408937740
Abstract
The problem considered in this paper is that of finding optimal controls for a class of fixed duration processes in systems described by non-linear difference equations. The discrete versions of the adjoint system and the Hamiltonian are used in conjunction with the original techniques found in the proofs of the Pontryagin maximum principle to derive conditions necessary for a control to be optimal. These necessary conditions are shown to be related to the Pontryagin conditions for continuous systems in the following manner: the requirement of a global maximum of a Hamiltonian becomes a condition of a local maximum or of stationarity, while the transversality conditions remain identical.
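
For orientation only, below is a minimal sketch of the kind of discrete formulation the abstract summarizes. The notation ($x_k$ state, $u_k$ control, $p_k$ adjoint variable, $H_k$ Hamiltonian, with any cost terms omitted) is assumed for illustration and is not taken from the paper.

```latex
% Minimal sketch (assumed notation, not quoted from the paper) of a discrete
% maximum principle of the kind the abstract describes, for stages k = 0..N-1.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align*}
  x_{k+1} &= f(x_k, u_k),                        && \text{non-linear difference equation (state),}\\
  H_k     &= p_{k+1}^{\mathsf T}\, f(x_k, u_k),  && \text{discrete Hamiltonian (cost term omitted),}\\
  p_k     &= \frac{\partial H_k}{\partial x_k},  && \text{discrete adjoint (costate) system,}\\
  0       &= \frac{\partial H_k}{\partial u_k},  && \text{stationarity, or a local maximum of } H_k \text{ in } u_k.
\end{align*}
\end{document}
```

Terminal (transversality) conditions on $p_N$ would then carry over unchanged from the continuous-time case, as the abstract notes.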