Abstract
In this paper, the problem of optimal feedback control of uncertain discrete-time dynamic systems is considered, in which the uncertain quantities do not have a stochastic description but instead are known to belong to given sets. The problem is converted to a sequential minimax problem, and dynamic programming is suggested as a general method for its solution. The notion of a sufficiently informative function, which parallels the notion of a sufficient statistic in stochastic optimal control, is introduced, and conditions under which the optimal controller decomposes into an estimator and an actuator are identified.
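To illustrate the sequential minimax approach the abstract describes, the following is a minimal sketch of a minimax dynamic-programming recursion. The system, costs, and sets below are illustrative assumptions, not taken from the paper: at each stage the controller picks the control minimizing the worst-case cost over a disturbance known only to lie in a given set W.

```python
# Minimax DP sketch (illustrative example, not the paper's system):
# scalar dynamics x_{k+1} = clip(x_k + u_k + w_k), disturbance w in a
# set W (set-membership description, no probabilities involved).
STATES = range(-3, 4)          # clipped integer state space
CONTROLS = (-1, 0, 1)          # admissible controls
W = (-1, 1)                    # disturbance set
HORIZON = 3

def clip(x):
    return max(min(x, 3), -3)

def stage_cost(x, u):
    return x * x + u * u       # quadratic stage cost (assumed)

def terminal_cost(x):
    return x * x

def minimax_dp():
    """Backward recursion J_k(x) = min_u max_{w in W} [g(x,u) + J_{k+1}(f(x,u,w))]."""
    J = {x: terminal_cost(x) for x in STATES}
    policy = []
    for _ in range(HORIZON):
        Jn, mu = {}, {}
        for x in STATES:
            best_u, best_val = None, float("inf")
            for u in CONTROLS:
                # worst-case (max over the disturbance set), not an expectation
                worst = max(stage_cost(x, u) + J[clip(x + u + w)] for w in W)
                if worst < best_val:
                    best_val, best_u = worst, u
            Jn[x], mu[x] = best_val, best_u
        J = Jn
        policy.insert(0, mu)   # policy[0] is the first-stage feedback law
    return J, policy

J0, policy = minimax_dp()
```

The only change from the stochastic dynamic-programming recursion is that the expectation over the disturbance is replaced by a maximum over its membership set, which is the conversion to a sequential minimax problem mentioned above.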
| Original language | English (US) |
| --- | --- |
| Article number | 4044796 |
| Pages (from-to) | 451-455 |
| Number of pages | 5 |
| Journal | Proceedings of the IEEE Conference on Decision and Control |
| State | Published - 1971 |
| Externally published | Yes |
| Event | 1971 IEEE Conference on Decision and Control, CDC 1971 - Including the 10th Symposium on Adaptive Processes, Miami Beach, United States. Duration: Dec 15 1971 → Dec 17 1971 |
ASJC Scopus subject areas
- Control and Systems Engineering
- Modeling and Simulation
- Control and Optimization