Optimal controller switching for stochastic systems

Abstract
This paper presents a solution to certain problems in switched controller design for stochastic dynamical systems. The main result is a separation theorem for partial-information systems, which is used to convert the partial-information stochastic control problem into an equivalent complete-information stochastic control problem. We also show that certainty equivalence does not hold. The optimal sequence of controllers can then be determined by solving an appropriate dynamic programming problem.
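As an illustration only, and not the paper's partial-information formulation, the sketch below shows how a finite-horizon dynamic programming recursion could select an optimal controller sequence once the problem has been reduced to complete information. The state space, candidate controllers, transition matrices, and stage costs are all hypothetical placeholders.

```python
import numpy as np

# Hypothetical sketch: backward induction over a finite set of candidate
# controllers for a discrete-time, finite-state stochastic system.
# All problem data below are illustrative assumptions, not the paper's model.
rng = np.random.default_rng(0)

n_states = 5        # size of the (discretized) state space
n_controllers = 3   # number of candidate controllers to switch among
horizon = 10        # planning horizon

# Controller k induces a transition matrix P[k] (rows sum to 1) and a
# stage cost c[k][x] incurred when controller k is applied in state x.
P = rng.random((n_controllers, n_states, n_states))
P /= P.sum(axis=2, keepdims=True)
c = rng.random((n_controllers, n_states))

# V[t][x] is the optimal cost-to-go from state x at time t;
# policy[t][x] is the index of the controller selected at that stage.
V = np.zeros((horizon + 1, n_states))
policy = np.zeros((horizon, n_states), dtype=int)

for t in range(horizon - 1, -1, -1):
    # Expected cost of applying controller k in each state: c[k] + P[k] @ V[t+1]
    Q = c + P @ V[t + 1]          # shape (n_controllers, n_states)
    policy[t] = Q.argmin(axis=0)  # best controller per state
    V[t] = Q.min(axis=0)          # optimal cost-to-go

print("Controller chosen at t=0 for each state:", policy[0])
```

The backward recursion evaluates, at each stage and state, the expected cost-to-go of committing to each candidate controller for one step, which is the standard dynamic programming structure the abstract refers to.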
