Mathematical theory of adaptive control
Bibliographic Information
Mathematical theory of adaptive control
(Interdisciplinary mathematical sciences, v. 4)
World Scientific, c2006
Notes
Assistant editors: Ł. Stettner and J. Zabczyk
Bibliography: p. 459-470
Includes index
Description and Table of Contents
Description
The theory of adaptive control is concerned with the construction of strategies that make the controlled system behave in a desirable way without assuming complete knowledge of the system. The models considered in this comprehensive book are of Markovian type, and both the partial-observation and partial-information cases are analyzed. While the book focuses on discrete-time models, continuous-time models are considered in the final chapter. The book provides a novel perspective by summarizing results on adaptive control obtained in the Soviet Union, which are not well known in the West, and includes comments on the interplay between the Russian and Western methods.
Table of Contents
- Basic Notions and Definitions
- Real-Valued HPIV with Finite Number of Controls: Automaton Approach
- Stochastic Approximation
- Minimax Adaptive Control
- Controlled Finite Homogeneous Markov Chains
- Control of Partially Observable Markov Chains and Regenerative Processes
- Control of Markov Processes with Discrete Time and Semi-Markov Processes
- Control of Stationary Processes
- Finite-Converging Procedures for Control Problems with Inequalities
- Control of Linear Difference Equations
- Control of Ordinary Differential Equations
- Control of Stochastic Differential Equations
Source: "Nielsen BookData"