Optimum Control of Distributed-Parameter Systems

Abstract
This paper presents a general discussion of the optimum control of distributed-parameter dynamical systems. The main areas of discussion are: (a) the mathematical description of distributed-parameter systems, (b) the controllability and observability of these systems, (c) the formulation of optimum control problems and the derivation of a maximum principle for a particular class of systems, and (d) the problems associated with approximating distributed systems by discretization. To illustrate the applicability of certain general results and to exhibit some of the properties intrinsic to distributed systems, specific results are obtained for a simple, one-dimensional, linear-diffusion process.
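As a concrete, hedged illustration of items (a) and (d), the sketch below is not from the paper: it discretizes the one-dimensional linear diffusion equation u_t = a·u_xx (the kind of process the abstract names) by an explicit finite-difference scheme, and shows the stability constraint that such approximations impose. All function names, step sizes, and boundary conditions here are illustrative assumptions.

```python
# Illustrative sketch (not the paper's method): approximate the 1-D
# linear diffusion equation u_t = a * u_xx on [0, 1] with zero Dirichlet
# boundary values, using a uniform spatial grid and explicit Euler in time.

def diffuse(u0, a=1.0, dx=0.1, dt=0.004, steps=50):
    """Advance the discretized diffusion equation `steps` time steps."""
    r = a * dt / dx**2          # the explicit scheme is stable only for r <= 1/2
    assert r <= 0.5, "explicit scheme unstable for this dt/dx pair"
    u = list(u0)
    for _ in range(steps):
        nxt = u[:]
        for i in range(1, len(u) - 1):        # update interior nodes only
            nxt[i] = u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
        u = nxt                                # endpoints stay at 0 (Dirichlet)
    return u

# Start from a peaked initial profile; diffusion spreads and flattens it.
u0 = [0.0] * 11
u0[5] = 1.0
u_final = diffuse(u0)
```

The stability restriction r = a·dt/dx² ≤ 1/2 is one instance of the approximation issues raised under (d): refining the spatial grid forces a much smaller time step, so the discretized model is not a neutral substitute for the distributed system.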