Conjugate gradient algorithms in nonconvex optimization

Bibliographic Information

Conjugate gradient algorithms in nonconvex optimization

Radosław Pytlak

(Nonconvex optimization and its applications, v. 89)

Springer, c2009


Note

Formerly CIP

Includes bibliographical references and index

Description and Table of Contents

Description

Conjugate direction methods were proposed in the early 1950s. When high-speed digital computing machines were developed, attempts were made to lay the foundations for the mathematical aspects of computations which could take advantage of the efficiency of digital computers. The National Bureau of Standards sponsored the Institute for Numerical Analysis, which was established at the University of California in Los Angeles. A seminar held there on numerical methods for linear equations was attended by Magnus Hestenes, Eduard Stiefel and Cornelius Lanczos. This led to the first communication between Lanczos and Hestenes (researchers of the NBS) and Stiefel (of the ETH in Zurich) on the conjugate direction algorithm. The method is attributed to Hestenes and Stiefel, who published their joint paper in 1952 [101], in which they presented both the method of conjugate gradient and the conjugate direction methods, including conjugate Gram-Schmidt processes. A closely related algorithm was proposed by Lanczos [114], who worked on algorithms for determining eigenvalues of a matrix. His iterative algorithm yields the similarity transformation of a matrix into tridiagonal form, from which eigenvalues can be well approximated. The three-term recurrence relation of the Lanczos procedure can be obtained by eliminating a vector from the conjugate direction algorithm scheme. Initially the conjugate gradient algorithm was called the Hestenes-Stiefel-Lanczos method [86].
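To illustrate the method the description refers to, below is a minimal sketch of the classical conjugate gradient iteration of Hestenes and Stiefel for solving Ax = b with a symmetric positive definite matrix A. The function name, tolerance, and iteration cap are illustrative choices, not taken from the book.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Sketch of the Hestenes-Stiefel conjugate gradient iteration for
    A x = b, with A symmetric positive definite (illustrative only)."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.copy()
    r = b - A @ x                  # initial residual
    p = r.copy()                   # initial search direction
    rs_old = r @ r
    max_iter = n if max_iter is None else max_iter
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)  # step length from exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        # Update coefficient; for quadratic problems with exact line search
        # this ratio coincides with the Hestenes-Stiefel formula.
        beta = rs_new / rs_old
        p = r + beta * p           # new direction, conjugate to the previous ones
        rs_old = rs_new
    return x

# Example use on a small symmetric positive definite system.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    A = M @ M.T + 5 * np.eye(5)    # make A symmetric positive definite
    b = rng.standard_normal(5)
    x = conjugate_gradient(A, b)
    print(np.linalg.norm(A @ x - b))   # residual should be near machine precision
```

For a quadratic objective this iteration terminates, in exact arithmetic, in at most n steps; the nonconvex extensions treated in the book replace the exact step length and the coefficient beta with line searches and formulas suited to general objectives.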

Table of Contents

Conjugate Direction Methods for Quadratic Problems
Conjugate Gradient Methods for Nonconvex Problems
Memoryless Quasi-Newton Methods
Preconditioned Conjugate Gradient Algorithms
Limited Memory Quasi-Newton Algorithms
The Method of Shortest Residuals and Nondifferentiable Optimization
The Method of Shortest Residuals for Differentiable Problems
The Preconditioned Shortest Residuals Algorithm
Optimization on a Polyhedron
Conjugate Gradient Algorithms for Problems with Box Constraints
Preconditioned Conjugate Gradient Algorithms for Problems with Box Constraints
Preconditioned Conjugate Gradient Based Reduced-Hessian Methods

by "Nielsen BookData"
