Tree-Based Convolutional Neural Networks : Principles and Applications. (2018)
- Record Type:
- Book
- Title:
- Tree-Based Convolutional Neural Networks : Principles and Applications. (2018)
- Main Title:
- Tree-Based Convolutional Neural Networks : Principles and Applications
- Further Information:
- Note: Lili Mou and Zhi Jin.
- Authors:
- Mou, Lili
Jin, Zhi, 1962-
- Contents:
- Intro; Preface; Acknowledgements; Contents; Acronyms; 1 Introduction; 1.1 Deep Learning Background; 1.2 Incorporating Structural Information into Neural Architectures; 1.3 The Proposed Tree-Based Convolutional Neural Networks; 1.4 Structure of the Book; References; 2 Background and Related Work; 2.1 Generic Neural Networks; 2.1.1 Neuron and Multilayer Network; 2.1.2 Training Objectives; 2.1.3 Learning Neural Parameters; 2.1.4 Pretraining Neural Networks; 2.2 Neural Networks for Natural Language Processing; 2.2.1 Specialty of Natural Language Processing; 2.2.2 Neural Language Models; 2.2.3 Word Embeddings; 2.3 Structure-Sensitive Neural Networks; 2.3.1 Convolutional Neural Network; 2.3.2 Recurrent Neural Network; 2.3.3 Recursive Neural Network; 2.4 Summary; References; 3 General Framework of Tree-Based Convolutional Neural Networks (TBCNNs); 3.1 General Idea and Formula of TBCNN; 3.2 Applications of TBCNN; 3.3 Difficulties in Designing TBCNN; 4 TBCNN for Programs' Abstract Syntax Trees; 4.1 Introduction; 4.2 The Proposed Approach; 4.2.1 Overview; 4.2.2 Representation Learning for AST Nodes; 4.2.3 Coding Layer; 4.2.4 Tree-Based Convolutional Layer; 4.2.5 Dynamic Pooling; 4.2.6 The "Continuous Binary Tree" Model; 4.3 Experiments; 4.3.1 Unsupervised Program Vector Representations; 4.3.2 Classifying Programs by Functionalities; 4.3.3 Detecting Bubble Sort; 4.3.4 Model Analysis; 4.4 Summary and Discussion; References; 5 TBCNN for Constituency Trees in Natural Language Processing; 5.1 Background of Sentence Modeling and Constituency Trees; 5.2 Proposed Model; 5.2.1 Constituency Trees as Input; 5.2.2 Recursively Representing Intermediate Nodes; 5.2.3 Constituency Tree-Based Convolutional Layer; 5.2.4 Dynamic Pooling Layer; 5.3 Experiments; 5.3.1 Sentiment Analysis; 5.3.2 Question Classification; 5.4 Summary and Discussions; References; 6 TBCNN for Dependency Trees in Natural Language Processing; 6.1 Background of Dependency Trees; 6.2 Proposed Model; 6.2.1 Dependency Trees as Input; 6.2.2 Convolutional Layer; 6.2.3 Dynamic Pooling Layer; 6.2.4 Applying d-TBCNN to Sentence Matching; 6.3 Experiments; 6.3.1 Discriminative Sentence Modeling; 6.3.2 Sentence Matching; 6.3.3 Model Analysis; 6.3.4 Visualization; 6.4 Conclusion and Discussion; References; 7 Conclusion and Future Work; 7.1 Conclusion; 7.2 Future Work; References; Index
- Publisher Details:
- Singapore : Springer Nature
- Publication Date:
- 2018
- Extent:
- 1 online resource
- Subjects:
- 006.32
Computer science
Neural networks (Computer science)
Machine learning
Artificial intelligence -- Data processing
COMPUTERS / General
Computers -- Database Management -- Data Mining
Computers -- Intelligence (AI) & Semantics
Computers -- Software Development & Engineering -- General
Data mining
Artificial intelligence
Software Engineering
Engineering
Software engineering
Electronic books
- Languages:
- English
- ISBNs:
- 9789811318702
9811318700
- Related ISBNs:
- 9789811318696
- Notes:
- Note: Online resource; title from PDF title page (EBSCO, viewed October 4, 2018).
- Access Rights:
- Legal Deposit; Only available on premises controlled by the deposit library and to one user at any one time; The Legal Deposit Libraries (Non-Print Works) Regulations (UK).
- Access Usage:
- Restricted: Printing from this resource is governed by The Legal Deposit Libraries (Non-Print Works) Regulations (UK) and UK copyright law currently in force.
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library HMNTS - ELD.DS.334183
- Ingest File:
- 01_279.xml