Introduction to linear algebra : computation, application, and theory

Author

    • DeBonis, Mark J.

Bibliographic Information

Introduction to linear algebra : computation, application, and theory

Mark J. DeBonis

(Textbooks in mathematics)

CRC Press, 2022

  • hbk (hardcover)

Held by 1 university library


Notes

Includes bibliographical references (p. 415) and index

Description and Table of Contents

Description

Features:
  • Includes cutting-edge applications in machine learning and data analytics.
  • Suitable as a primary text for undergraduates studying linear algebra.
  • Requires very little in the way of prerequisites.

Table of Contents

1. Examples of Vector Spaces. 1.1. First Vector Space: Tuples. 1.2. Dot Product. 1.3. Application: Geometry. 1.4. Second Vector Space: Matrices. 1.5. Matrix Multiplication.

2. Matrices and Linear Systems. 2.1. Systems of Linear Equations. 2.2. Gaussian Elimination. 2.3. Application: Markov Chains. 2.4. Application: The Simplex Method. 2.5. Elementary Matrices and Matrix Equivalence. 2.6. Inverse of a Matrix. 2.7. Application: The Simplex Method Revisited. 2.8. Homogeneous/Nonhomogeneous Systems and Rank. 2.9. Determinant. 2.10. Applications of the Determinant. 2.11. Application: LU Factorization.

3. Vector Spaces. 3.1. Definition and Examples. 3.2. Subspace. 3.3. Linear Independence. 3.4. Span. 3.5. Basis and Dimension. 3.6. Subspaces Associated with a Matrix. 3.7. Application: Dimension Theorems.

4. Linear Transformations. 4.1. Definition and Examples. 4.2. Kernel and Image. 4.3. Matrix Representation. 4.4. Inverse and Isomorphism. 4.5. Similarity of Matrices. 4.6. Eigenvalues and Diagonalization. 4.7. Axiomatic Determinant. 4.8. Quotient Vector Space. 4.9. Dual Vector Space.

5. Inner Product Spaces. 5.1. Definition, Examples and Properties. 5.2. Orthogonal and Orthonormal. 5.3. Orthogonal Matrices. 5.4. Application: QR Factorization. 5.5. Schur Triangularization Theorem. 5.6. Orthogonal Projections and Best Approximation. 5.7. Real Symmetric Matrices. 5.8. Singular Value Decomposition. 5.9. Application: Least Squares Optimization.

6. Applications in Data Analytics. 6.1. Introduction. 6.2. Direction of Maximal Spread. 6.3. Principal Component Analysis. 6.4. Dimensionality Reduction. 6.5. Mahalanobis Distance. 6.6. Data Sphering. 6.7. Fisher Linear Discriminant Function. 6.8. Linear Discriminant Functions in Feature Space. 6.9. Minimal Square Error Linear Discriminant Function.

7. Quadratic Forms. 7.1. Introduction to Quadratic Forms. 7.2. Principal Minor Criterion. 7.3. Eigenvalue Criterion. 7.4. Application: Unconstrained Nonlinear Optimization. 7.5. General Quadratic Forms.

Appendix A. Regular Matrices. Appendix B. Rotations and Reflections in Two Dimensions. Appendix C. Answers to Selected Exercises.

Source: "Nielsen BookData"


Details

  • NII Bibliographic ID (NCID)
    BD00439950
  • ISBN
    • 9781032108988
  • LCCN
    2021042588
  • Country of Publication Code
    us
  • Title Language Code
    eng
  • Text Language Code
    eng
  • Place of Publication
    Boca Raton
  • Pages/Volumes
    xiii, 420 p.
  • Size
    27 cm