On the minimax feedback control of uncertain dynamic systems

Dimitri P. Bertsekas, Ian B. Rhodes

Research output: Contribution to journal › Conference article › peer-review



In this paper the problem of optimal feedback control of uncertain discrete-time dynamic systems is considered, where the uncertain quantities do not have a stochastic description but instead are known only to belong to given sets. The problem is converted to a sequential minimax problem, and dynamic programming is suggested as a general method for its solution. The notion of a sufficiently informative function, which parallels the notion of a sufficient statistic in stochastic optimal control, is introduced, and conditions are identified under which the optimal controller decomposes into an estimator and an actuator.
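The sequential minimax recursion mentioned in the abstract can be illustrated with a small sketch. This is not the paper's formulation; the dynamics, costs, and the sets below are hypothetical choices made purely to show the structure: at each stage the controller minimizes over its control while the disturbance, known only to lie in a given set, acts as a maximizer.

```python
# Minimal sketch of minimax dynamic programming with set-membership
# (non-stochastic) uncertainty:
#   J_N(x) = g_N(x)
#   J_k(x) = min_{u in U} max_{w in W} [ g(x, u, w) + J_{k+1}(f(x, u, w)) ]
# The system, costs, and sets here are illustrative assumptions,
# not those of the Bertsekas-Rhodes paper.

X = range(-3, 4)   # state set (dynamics are clipped to stay inside)
U = (-1, 0, 1)     # admissible controls
W = (-1, 1)        # disturbance set: unknown but bounded, no probabilities
N = 3              # horizon length

def f(x, u, w):
    # scalar dynamics x_{k+1} = x + u + w, clipped to the state set
    return max(min(x + u + w, 3), -3)

def g(x, u, w):
    return x * x + u * u   # stage cost (disturbance-independent here)

def gN(x):
    return x * x           # terminal cost

# Backward recursion: J[k][x] is the worst-case cost-to-go from x at stage k.
J = {N: {x: gN(x) for x in X}}
policy = {}
for k in range(N - 1, -1, -1):
    J[k], policy[k] = {}, {}
    for x in X:
        best_u, best_val = None, float("inf")
        for u in U:
            # disturbance plays the worst case against this control
            worst = max(g(x, u, w) + J[k + 1][f(x, u, w)] for w in W)
            if worst < best_val:
                best_u, best_val = u, worst
        J[k][x], policy[k][x] = best_val, best_u
```

Because the recursion takes a maximum over the disturbance set rather than an expectation, the resulting `policy` guarantees a cost bound against every admissible disturbance sequence, which is the minimax counterpart of the stochastic DP recursion.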

Original language: English (US)
Article number: 4044796
Pages (from-to): 451-455
Number of pages: 5
Journal: Proceedings of the IEEE Conference on Decision and Control
State: Published - 1971
Externally published: Yes
Event: 1971 IEEE Conference on Decision and Control, CDC 1971 - Including the 10th Symposium on Adaptive Process - Miami Beach, United States
Duration: Dec 15, 1971 – Dec 17, 1971

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Modeling and Simulation
  • Control and Optimization

