Deep learning architectures : a mathematical approach. (2020)
- Record Type:
- Book
- Title:
- Deep learning architectures : a mathematical approach. (2020)
- Main Title:
- Deep learning architectures : a mathematical approach
- Further Information:
- Note: Ovidiu Calin.
- Other Names:
- Calin, Ovidiu
- Contents:
- Intro -- Foreword -- Overview -- Part I -- Part II -- Part III -- Part IV -- Part V -- Bibliographical Remarks -- Chapters Diagram -- Notations and Symbols -- Calculus -- Linear Algebra -- Probability Theory -- Measure Theory -- Information Theory -- Differential Geometry -- Neural Networks -- Contents -- Part I Introduction to Neural Networks -- 1 Introductory Problems -- 1.1 Water in a Sink -- 1.2 An Electronic Circuit -- 1.3 The Eight Rooks Problem -- 1.4 Biological Neuron -- 1.5 Linear Regression -- 1.6 The Cocktail Factory Network -- 1.7 An Electronic Network -- 1.8 Summary -- 1.9 Exercises -- 2 Activation Functions -- 2.1 Examples of Activation Functions -- 2.2 Sigmoidal Functions -- 2.3 Squashing Functions -- 2.4 Summary -- 2.5 Exercises -- 3 Cost Functions -- 3.1 Input, Output, and Target -- 3.2 The Supremum Error Function -- 3.3 The L2-Error Function -- 3.4 Mean Square Error Function -- 3.5 Cross-entropy -- 3.6 Kullback-Leibler Divergence -- 3.7 Jensen-Shannon Divergence -- 3.8 Maximum Mean Discrepancy -- 3.9 Other Cost Functions -- 3.10 Sample Estimation of Cost Functions -- 3.11 Cost Functions and Regularization -- 3.12 Training and Test Errors -- 3.13 Geometric Significance -- 3.14 Summary -- 3.15 Exercises -- 4 Finding Minima Algorithms -- 4.1 General Properties of Minima -- 4.1.1 Functions of a real variable -- 4.1.2 Functions of several real variables -- 4.2 Gradient Descent Algorithm -- 4.2.1 Level sets -- 4.2.2 Directional derivative -- 4.2.3 Method of Steepest Descent -- 4.2.4 Line Search Method -- 4.3 Kinematic Interpretation -- 4.4 Momentum Method -- 4.4.1 Kinematic Interpretation -- 4.4.2 Convergence conditions -- 4.5 AdaGrad -- 4.6 RMSProp -- 4.7 Adam -- 4.8 AdaMax -- 4.9 Simulated Annealing Method -- 4.9.1 Kinematic Approach for SA -- 4.9.2 Thermodynamic Interpretation for SA -- 4.10 Increasing Resolution Method -- 4.11 Hessian Method -- 4.12 Newton's Method -- 4.13 Stochastic Search -- 4.13.1 Deterministic variant -- 4.13.2 Stochastic variant -- 4.14 Neighborhood Search -- 4.14.1 Left and Right Search -- 4.14.2 Circular Search -- 4.14.3 Stochastic Spherical Search -- 4.14.4 From Local to Global -- 4.15 Continuous Learning -- 4.16 Summary -- 4.17 Exercises -- 5 Abstract Neurons -- 5.1 Definition and Properties -- 5.2 Perceptron Model -- 5.3 The Sigmoid Neuron -- 5.4 Logistic Regression -- 5.4.1 Default probability of a company -- 5.4.2 Binary Classifier -- 5.4.3 Learning with the square difference cost function -- 5.5 Linear Neuron -- 5.6 Adaline -- 5.7 Madaline -- 5.8 Continuum Input Neuron -- 5.9 Summary -- 5.10 Exercises -- 6 Neural Networks -- 6.1 An Example of Neural Network -- 6.1.1 Total variation and regularization -- 6.1.2 Backpropagation -- 6.2 General Neural Networks -- 6.2.1 Forward pass through the network -- 6.2.2 Going backwards through the network -- 6.2.3 Backpropagation of deltas -- 6.2.4 Concluding relations -- 6.2.5 Matrix form … (more)
- Publisher Details:
- Cham : Springer
- Publication Date:
- 2020
- Extent:
- 1 online resource (768 pages)
- Subjects:
- 006.3/101/51
Machine learning -- Mathematics
Electronic books
- Languages:
- English
- ISBNs:
- 9783030367213
3030367215
9783030367220
3030367223
9783030367237
3030367231
- Related ISBNs:
- 9783030367206
3030367207
- Notes:
- Note: Includes bibliographical references and index.
Note: Print version record.
- Access Rights:
- Legal Deposit; Only available on premises controlled by the deposit library and to one user at any one time; The Legal Deposit Libraries (Non-Print Works) Regulations (UK).
- Access Usage:
- Restricted: Printing from this resource is governed by The Legal Deposit Libraries (Non-Print Works) Regulations (UK) and UK copyright law currently in force.
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library HMNTS - ELD.DS.491451
- Ingest File:
- 03_053.xml