Mathematical theory of control systems design

Bibliographic Details

Mathematical theory of control systems design

by V.N. Afanas'ev, V.B. Kolmanovskii, and V.R. Nosov

(Mathematics and its applications, v. 341)

Kluwer Academic, c1996

Held by 30 university libraries

Notes

Includes bibliographical references (p. 657-661) and index

Description and Table of Contents

Description

Give, and it shall be given unto you. ST. LUKE, VI, 38. The book is based on several courses of lectures on control theory and applications which were delivered by the authors for a number of years at Moscow Electronics and Mathematics University. The book, originally written in Russian, was first published by Vysshaya Shkola (Higher School) Publishing House in Moscow in 1989. In preparing a new edition of the book we planned to make only minor changes in the text. However, we soon realized that we, like many scholars working in control theory, had learned many new things and had had many new insights into control theory and its applications since the book was first published. Therefore, we rewrote the book especially for the English edition. So, this is substantially a new book with many new topics. The book consists of an introduction and four parts. Part One deals with the fundamentals of modern stability theory: general results concerning stability and instability, sufficient conditions for the stability of linear systems, methods for determining the stability or instability of systems of various types, and theorems on stability under random disturbances.
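
To give a flavor of the results summarized for Part One, the following is a standard Lyapunov-type sufficient condition for the stability of linear systems, sketched here for orientation only (it is not quoted from the book). For the linear system
\[
\dot{x}(t) = A x(t), \qquad x(t) \in \mathbb{R}^n,
\]
if for some symmetric positive definite matrix $Q$ the Lyapunov equation
\[
A^{\top} P + P A = -Q
\]
admits a symmetric positive definite solution $P$, then $V(x) = x^{\top} P x$ is a Lyapunov function and the zero solution is asymptotically stable; equivalently, all eigenvalues of $A$ have negative real parts.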

Table of Contents

Preface.
Introduction.
Part One: Stability of control systems.
I. Continuous and discrete deterministic systems.
II. Stability of stochastic systems.
Part Two: Control of deterministic systems.
III. Description of control problems.
IV. The classical calculus of variations and optimal control.
V. The maximum principle.
VI. Linear control systems.
VII. Dynamic programming approach. Sufficient conditions for optimal control.
VIII. Some additional topics of optimal control theory.
Part Three: Optimal control of dynamical systems under random disturbances.
IX. Control of stochastic systems. Problem statements and investigation techniques.
X. Optimal control on a time interval of random duration.
XI. Optimal estimation of the state of the system.
XII. Optimal control of the observation process.
Part Four: Numerical methods in control systems.
XIII. Linear time-invariant control systems.
XIV. Numerical methods for the investigation of nonlinear control systems.
XV. Numerical design of optimal control systems.
General references.
Subject index.

From "Nielsen BookData"

Details

  • NII Bibliographic ID (NCID)
    BA27163779
  • ISBN
    0792337247
  • LCCN
    95020901
  • Country of Publication Code
    ne
  • Title Language Code
    eng
  • Text Language Code
    eng
  • Place of Publication
    Dordrecht
  • Pages/Volumes
    xxiii, 668 p.
  • Size
    25 cm