Mathematical theory of adaptive control
Author(s)
Bibliographic Information
(Interdisciplinary mathematical sciences, v. 4)
World Scientific, c2006
Available at / 4 libraries
Note
Assistant editors: Ł. Stettner and J. Zabczyk
Bibliography: p. 459-470
Includes index
Description and Table of Contents
Description
The theory of adaptive control is concerned with the construction of strategies that make the controlled system behave in a desirable way without assuming complete knowledge of the system. The models considered in this comprehensive book are of Markovian type. Both the partial-observation and partial-information cases are analyzed. While the book focuses on discrete-time models, continuous-time models are treated in the final chapters. The book offers a novel perspective by summarizing results on adaptive control obtained in the Soviet Union, which are not well known in the West, and includes comments on the interplay between the Russian and Western methods.
Table of Contents
Basic Notions and Definitions
Real-Valued HPIV with Finite Number of Controls: Automaton Approach
Stochastic Approximation
Minimax Adaptive Control
Controlled Finite Homogeneous Markov Chains
Control of Partially Observable Markov Chains and Regenerative Processes
Control of Markov Processes with Discrete Time and Semi-Markov Processes
Control of Stationary Processes
Finite-Converging Procedures for Control Problems with Inequalities
Control of Linear Difference Equations
Control of Ordinary Differential Equations
Control of Stochastic Differential Equations
by "Nielsen BookData"