Linear algebra and optimization for machine learning : a textbook. (2020)
- Record Type:
- Book
- Title:
- Linear algebra and optimization for machine learning : a textbook. (2020)
- Main Title:
- Linear algebra and optimization for machine learning : a textbook
- Further Information:
- Note: Charu C. Aggarwal.
- Other Names:
- Aggarwal, Charu C
- Contents:
- Intro -- Contents -- Preface -- Acknowledgments -- Author Biography -- 1 Linear Algebra and Optimization: An Introduction -- 1.1 Introduction -- 1.2 Scalars, Vectors, and Matrices -- 1.2.1 Basic Operations with Scalars and Vectors -- 1.2.2 Basic Operations with Vectors and Matrices -- 1.2.3 Special Classes of Matrices -- 1.2.4 Matrix Powers, Polynomials, and the Inverse -- 1.2.5 The Matrix Inversion Lemma: Inverting the Sum of Matrices -- 1.2.6 Frobenius Norm, Trace, and Energy -- 1.3 Matrix Multiplication as a Decomposable Operator -- 1.3.1 Matrix Multiplication as Decomposable Row and Column Operators -- 1.3.2 Matrix Multiplication as Decomposable Geometric Operators -- 1.4 Basic Problems in Machine Learning -- 1.4.1 Matrix Factorization -- 1.4.2 Clustering -- 1.4.3 Classification and Regression Modeling -- 1.4.4 Outlier Detection -- 1.5 Optimization for Machine Learning -- 1.5.1 The Taylor Expansion for Function Simplification -- 1.5.2 Example of Optimization in Machine Learning -- 1.5.3 Optimization in Computational Graphs -- 1.6 Summary -- 1.7 Further Reading -- 1.8 Exercises -- 2 Linear Transformations and Linear Systems -- 2.1 Introduction -- 2.1.1 What Is a Linear Transform?
-- 2.2 The Geometry of Matrix Multiplication -- 2.3 Vector Spaces and Their Geometry -- 2.3.1 Coordinates in a Basis System -- 2.3.2 Coordinate Transformations Between Basis Sets -- 2.3.3 Span of a Set of Vectors -- 2.3.4 Machine Learning Example: Discrete Wavelet Transform -- 2.3.5 Relationships Among Subspaces of a Vector Space -- 2.4 The Linear Algebra of Matrix Rows and Columns -- 2.5 The Row Echelon Form of a Matrix -- 2.5.1 LU Decomposition -- 2.5.2 Application: Finding a Basis Set -- 2.5.3 Application: Matrix Inversion -- 2.5.4 Application: Solving a System of Linear Equations -- 2.6 The Notion of Matrix Rank -- 2.6.1 Effect of Matrix Operations on Rank -- 2.7 Generating Orthogonal Basis Sets -- 2.7.1 Gram-Schmidt Orthogonalization and QR Decomposition -- 2.7.2 QR Decomposition -- 2.7.3 The Discrete Cosine Transform -- 2.8 An Optimization-Centric View of Linear Systems -- 2.8.1 Moore-Penrose Pseudoinverse -- 2.8.2 The Projection Matrix -- 2.9 Ill-Conditioned Matrices and Systems -- 2.10 Inner Products: A Geometric View -- 2.11 Complex Vector Spaces -- 2.11.1 The Discrete Fourier Transform -- 2.12 Summary -- 2.13 Further Reading -- 2.14 Exercises -- 3 Eigenvectors and Diagonalizable Matrices -- 3.1 Introduction -- 3.2 Determinants -- 3.3 Diagonalizable Transformations and Eigenvectors -- 3.3.1 Complex Eigenvalues -- 3.3.2 Left Eigenvectors and Right Eigenvectors -- 3.3.3 Existence and Uniqueness of Diagonalization -- 3.3.4 Existence and Uniqueness of Triangulization -- 3.3.5 Similar Matrix Families Sharing Eigenvalues -- 3.3.6 Diagonalizable Matrix Families Sharing Eigenvectors -- 3.3.7 Symmetric Matrices -- 3.3.8 Positive Semidefinite Matrices … (more)
- Publisher Details:
- Cham : Springer
- Publication Date:
- 2020
- Extent:
- 1 online resource (507 p.)
- Subjects:
- 512/.5
Algebras, Linear
Machine learning -- Mathematics
Electronic books
- Languages:
- English
- ISBNs:
- 9783030403447
3030403440
- Related ISBNs:
- 9783030403430
3030403432
- Notes:
- Note: Includes bibliographical references and index.
- Access Rights:
- Legal Deposit; Only available on premises controlled by the deposit library and to one user at any one time; The Legal Deposit Libraries (Non-Print Works) Regulations (UK).
- Access Usage:
- Restricted: Printing from this resource is governed by The Legal Deposit Libraries (Non-Print Works) Regulations (UK) and UK copyright law currently in force.
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library HMNTS - ELD.DS.507265
- Ingest File:
- 03_082.xml