Mathematical theories of machine learning -- theory and applications. (2020)
- Record Type:
- Book
- Title:
- Mathematical theories of machine learning -- theory and applications. (2020)
- Main Title:
- Mathematical theories of machine learning -- theory and applications
- Further Information:
- Note: Bin Shi, S.S. Iyengar.
- Other Names:
- Shi, Bin
Iyengar, S. S. (Sundararaja S.)
- Contents:
- Intro; Foreword; Preface; Acknowledgments; Contents; Author Biographies;
Part I Introduction; 1 Introduction; 1.1 Neural Networks; 1.1.1 Learning Process That Is Iterative; 1.2 Deep Learning; 1.3 Gradient Descent; 1.3.1 Batch Gradient Descent; 1.3.2 Stochastic Gradient Descent; 1.3.3 Mini-Batch Gradient Descent; 1.4 Summary; 1.5 Organization of the Research Monograph; 2 General Framework of Mathematics; 2.1 Concluding Remarks; 3 Optimization Formulation; 3.1 Optimization Techniques Needed for Machine Learning; 3.1.1 Gradient Descent; 3.1.2 Accelerated Gradient Descent; 3.1.3 Application to Sparse Subspace Clustering; 3.2 Online Algorithms: Sequential Updating for Machine Learning; 3.2.1 Application to Multivariate Time Series (MTS); 3.3 Concluding Remarks; 4 Development of Novel Techniques of CoCoSSC Method; 4.1 Research Questions; 4.2 Accelerated Gradient Descent; 4.3 The CoCoSSC Method; 4.3.1 Advantages of CoCoSSC; 4.4 Online Time-Varying Elastic-Net Algorithm; 4.5 Concluding Remarks; 5 Necessary Notations of the Proposed Method; 5.1 Concluding Remarks; 6 Related Work on Geometry of Non-Convex Programs; 6.1 Multivariate Time-Series Data Sets; 6.2 Particle Learning; 6.3 Application to Climate Change; 6.4 Concluding Remarks;
Part II Mathematical Framework for Machine Learning: Theoretical Part; 7 Gradient Descent Converges to Minimizers: Optimal and Adaptive Step-Size Rules; 7.1 Introduction; 7.1.1 Related Works; 7.2 Notations and Preliminaries; 7.3 Maximum Allowable Step Size; 7.3.1 Consequences of Theorem 7.1; 7.3.2 Optimality of Theorem 7.1; 7.4 Adaptive Step-Size Rules; 7.5 Proof of Theorem 7.1; 7.6 Proof of Theorem 7.2; 7.6.1 Hartman Product Map Theorem; 7.6.2 Complete Proof of Theorem 7.2; 7.7 Additional Theorems; 7.8 Technical Proofs; 7.9 Conclusions; 8 A Conservation Law Method Based on Optimization; 8.1 Warm-up: An Analytical Demonstration for Intuition; 8.2 Symplectic Scheme and Algorithms; 8.2.1 Artificially Dissipating Energy Algorithm; A Simple Example for Illustration; 8.2.2 Detecting Local Minima Using Energy Conservation Algorithm; The Simple Example for Illustration; 8.2.3 Combined Algorithm; 8.3 An Asymptotic Analysis for the Phenomena of Local High-Speed Convergence; 8.3.1 Some Lemmas for the Linearized Scheme; 8.3.2 The Asymptotic Analysis; 8.4 Experimental Demonstration; 8.4.1 Strongly Convex Function; 8.4.2 Non-Strongly Convex Function; 8.4.3 Non-Convex Function; 8.5 Conclusion and Further Works;
Part III Mathematical Framework for Machine Learning: Application Part; 9 Improved Sample Complexity in Sparse Subspace Clustering with Noisy and Missing Observations; 9.1 Main Results About CoCoSSC Algorithm; 9.1.1 The Non-Uniform Semi-Random Model; 9.1.2 The Fully Random Model; 9.2 Proofs; 9.2.1 Noise Characterization and Feasibility of Pre-Processing; 9.2.2 Optimality Condition and Dual Certificates; 9.2.3 Deterministic Success Conditions; 9.2.4 Bounding u and r in Randomized Models … (more)
- Publisher Details:
- Cham : Springer
- Publication Date:
- 2020
- Extent:
- 1 online resource (138 pages)
- Subjects:
- Machine learning -- Mathematics
Electronic books
- Dewey Classification:
- 006.3/10151
- Languages:
- English
- ISBNs:
- 9783030170769
3030170764
- Related ISBNs:
- 3030170756
9783030170752
- Notes:
- Note: Print version record.
- Access Rights:
- Legal Deposit; Only available on premises controlled by the deposit library and to one user at any one time; The Legal Deposit Libraries (Non-Print Works) Regulations (UK).
- Access Usage:
- Restricted: Printing from this resource is governed by The Legal Deposit Libraries (Non-Print Works) Regulations (UK) and UK copyright law currently in force.
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library HMNTS - ELD.DS.432624