Bibliographic Information

Controlled diffusion processes

N.V. Krylov ; translated by A.B. Aries ; [editor, A.V. Balakrishnan]

(Applications of mathematics, 14)

Springer-Verlag, c1980


Note

Translation of Upravli︠a︡emye prot︠s︡essy diffuzionnogo tipa

Bibliography: p. 303-306

Includes index

Description and Table of Contents

Description

Stochastic control theory is a relatively young branch of mathematics. The beginning of its intensive development falls in the late 1950s and early 1960s. During that period an extensive literature appeared on optimal stochastic control using the quadratic performance criterion (see references in Wonham [76]). At the same time, Girsanov [25] and Howard [26] made the first steps in constructing a general theory, based on Bellman's technique of dynamic programming, developed by him somewhat earlier [4]. Two types of engineering problems engendered two different parts of stochastic control theory. Problems of the first type are associated with multistep decision making in discrete time, and are treated in the theory of discrete stochastic dynamic programming. For more on this theory, we note, in addition to the work of Howard and Bellman mentioned above, the books by Derman [8], Mine and Osaki [55], and Dynkin and Yushkevich [12]. Another class of engineering problems which encouraged the development of the theory of stochastic control involves time-continuous control of a dynamic system in the presence of random noise. The case where the system is described by a differential equation and the noise is modeled as a time-continuous random process is the core of the optimal control theory of diffusion processes. This book deals with this latter theory.
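
The closing sentences above refer to controlling a system governed by a differential equation perturbed by random noise, with Bellman's dynamic programming as the main tool. The following is a minimal LaTeX sketch of that setup in generic notation; the symbols and the infinite-horizon discounted form are standard textbook conventions assumed here, not quotations from this record or from the book itself.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% A minimal sketch of a controlled diffusion and its Bellman equation
% (generic notation, assumed rather than taken from this record).
% The state $x_t$ is driven by a strategy $\alpha_t$ with values in a set $A$;
% $w_t$ is a Wiener process and $\lambda > 0$ a discount rate.
\begin{align*}
  dx_t &= b(\alpha_t, x_t)\,dt + \sigma(\alpha_t, x_t)\,dw_t, \qquad x_0 = x, \\
  v(x) &= \sup_{\alpha} \mathbf{E}_x \int_0^{\infty} e^{-\lambda t} f(\alpha_t, x_t)\,dt .
\end{align*}
% Bellman's dynamic programming principle leads to the equation satisfied by
% the payoff function $v$:
\begin{equation*}
  \sup_{a \in A} \Bigl[ \tfrac{1}{2}\operatorname{tr}\bigl(\sigma\sigma^{*}(a,x)\,D^{2}v(x)\bigr)
    + b(a,x) \cdot \nabla v(x) - \lambda v(x) + f(a,x) \Bigr] = 0 .
\end{equation*}
\end{document}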

Table of Contents

1 Introduction to the Theory of Controlled Diffusion Processes.- 1. The Statement of Problems-Bellman's Principle-Bellman's Equation.- 2. Examples of the Bellman Equations-The Normed Bellman Equation.- 3. Application of Optimal Control Theory-Techniques for Obtaining Some Estimates.- 4. One-Dimensional Controlled Processes.- 5. Optimal Stopping of a One-Dimensional Controlled Process.- Notes.

2 Auxiliary Propositions.- 1. Notation and Definitions.- 2. Estimates of the Distribution of a Stochastic Integral in a Bounded Region.- 3. Estimates of the Distribution of a Stochastic Integral in the Whole Space.- 4. Limit Behavior of Some Functions.- 5. Solutions of Stochastic Integral Equations and Estimates of the Moments.- 6. Existence of a Solution of a Stochastic Equation with Measurable Coefficients.- 7. Some Properties of a Random Process Depending on a Parameter.- 8. The Dependence of Solutions of a Stochastic Equation on a Parameter.- 9. The Markov Property of Solutions of Stochastic Equations.- 10. Itô's Formula with Generalized Derivatives.- Notes.

3 General Properties of a Payoff Function.- 1. Basic Results.- 2. Some Preliminary Considerations.- 3. The Proof of Theorems 1.5-1.7.- 4. The Proof of Theorems 1.8-1.11 for the Optimal Stopping Problem.- Notes.

4 The Bellman Equation.- 1. Estimation of First Derivatives of Payoff Functions.- 2. Estimation from Below of Second Derivatives of a Payoff Function.- 3. Estimation from Above of Second Derivatives of a Payoff Function.- 4. Estimation of a Derivative of a Payoff Function with Respect to t.- 5. Passage to the Limit in the Bellman Equation.- 6. The Approximation of Degenerate Controlled Processes by Nondegenerate Ones.- 7. The Bellman Equation.- Notes.

5 The Construction of ε-Optimal Strategies.- 1. ε-Optimal Markov Strategies and the Bellman Equation.- 2. ε-Optimal Markov Strategies. The Bellman Equation in the Presence of Degeneracy.- 3. The Payoff Function and Solution of the Bellman Equation: The Uniqueness of the Solution of the Bellman Equation.- Notes.

6 Controlled Processes with Unbounded Coefficients: The Normed Bellman Equation.- 1. Generalization of the Results Obtained in Section 3.1.- 2. General Methods for Estimating Derivatives of Payoff Functions.- 3. The Normed Bellman Equation.- 4. The Optimal Stopping of a Controlled Process on an Infinite Interval of Time.- 5. Control on an Infinite Interval of Time.- Notes.

Appendices.- 1. Some Properties of Stochastic Integrals.- 2. Some Properties of Submartingales.

by "Nielsen BookData"


Details

  • NCID
    BA06949506
  • ISBN
    • 0387904611
    • 3540904611
  • LCCN
    79026631
  • Country Code
    us
  • Title Language Code
    eng
  • Text Language Code
    eng
  • Original Language Code
    rus
  • Place of Publication
    New York
  • Pages/Volumes
    xii, 308 p.
  • Size
    25 cm