Graph theory and sparse matrix computation
Author(s)
Alan George, John R. Gilbert, Joseph W. H. Liu (eds.)
Bibliographic Information
Graph theory and sparse matrix computation
(The IMA volumes in mathematics and its applications, v. 56)
Springer-Verlag, [2013]
pbk
Available at 2 libraries
Note
"Softcover reprint of the hardcover 1st edition 1993"--T.p verso
Includes bibliographical references
Description and Table of Contents
Description
This IMA Volume in Mathematics and its Applications, GRAPH THEORY AND SPARSE MATRIX COMPUTATION, is based on the proceedings of a workshop that was an integral part of the 1991-92 IMA program on "Applied Linear Algebra." The purpose of the workshop was to bring together people who work in sparse matrix computation with those who conduct research in applied graph theory and graph algorithms, in order to foster active cross-fertilization. We are grateful to Richard Brualdi, George Cybenko, Alan George, Gene Golub, Mitchell Luskin, and Paul Van Dooren for planning and implementing the year-long program. We especially thank Alan George, John R. Gilbert, and Joseph W.H. Liu for organizing this workshop and editing the proceedings. The financial support of the National Science Foundation made the workshop possible. Avner Friedman, Willard Miller, Jr.
PREFACE: When reality is modeled by computation, linear algebra is often the connection between the continuous physical world and the finite algorithmic one. Usually, the more detailed the model, the bigger the matrix, and the better the answer.
Efficiency demands that every possible advantage be exploited: sparse structure, advanced computer architectures, efficient algorithms. Therefore sparse matrix computation knits together threads from linear algebra, parallel computing, data structures, geometry, and both numerical and discrete algorithms.
Table of Contents
  An introduction to chordal graphs and clique trees
  Cutting down on fill using nested dissection: provably good elimination orderings
  Automatic mesh partitioning
  Structural representations of Schur complements in sparse matrices
  Irreducibility and primitivity of Perron complements: application of the compressed directed graph
  Predicting structure in nonsymmetric sparse matrix factorizations
  Highly parallel sparse triangular solution
  The fan-both family of column-based distributed Cholesky factorization algorithms
  Scalability of sparse direct solvers
  Sparse matrix factorization on SIMD parallel computers
  The efficient parallel iterative solution of large sparse linear systems
by "Nielsen BookData"