Natural language processing with TensorFlow : teach language to machines using Python's deep learning library. ([2018])
- Record Type:
- Book
- Title:
- Natural language processing with TensorFlow : teach language to machines using Python's deep learning library. ([2018])
- Main Title:
- Natural language processing with TensorFlow : teach language to machines using Python's deep learning library
- Further Information:
- Note: Thushan Ganegedara.
- Authors:
- Ganegedara, Thushan
- Contents:
- Cover; Copyright; Packt Upsell; Contributors; Table of Contents; Preface; Chapter 1: Introduction to Natural Language Processing; What is Natural Language Processing?; Tasks of Natural Language Processing; The traditional approach to Natural Language Processing; Understanding the traditional approach; Example -- generating football game summaries; Drawbacks of the traditional approach; The deep learning approach to Natural Language Processing; History of deep learning; The current state of deep learning and NLP; Understanding a simple deep model -- a Fully Connected Neural Network; The roadmap -- beyond this chapter; Introduction to the technical tools; Description of the tools; Installing Python and scikit-learn; Installing Jupyter Notebook; Installing TensorFlow; Summary; Chapter 2: Understanding TensorFlow; What is TensorFlow?; Getting started with TensorFlow; TensorFlow client in detail; TensorFlow architecture -- what happens when you execute the client?; Cafe Le TensorFlow -- understanding TensorFlow with an analogy; Inputs, variables, outputs, and operations; Defining inputs in TensorFlow; Feeding data with Python code; Preloading and storing data as tensors; Building an input pipeline; Defining variables in TensorFlow; Defining TensorFlow outputs; Defining TensorFlow operations; Comparison operations; Mathematical operations; Scatter and gather operations; Neural network-related operations; Reusing variables with scoping; Implementing our first neural network; Preparing the data; Defining the TensorFlow graph; Running the neural network; Summary; Chapter 3: Word2vec -- Learning Word Embeddings; What is a word representation or meaning?; Classical approaches to learning word representation; WordNet -- using an external lexical knowledge base for learning word representations; Tour of WordNet; Problems with WordNet; One-hot encoded representation; The TF-IDF method; Co-occurrence matrix; Word2vec -- a neural network-based approach to learning word representation; Exercise: is queen = king -- he + she?; Designing a loss function for learning word embeddings; The skip-gram algorithm; From raw text to structured data; Learning the word embeddings with a neural network; Formulating a practical loss function; Efficiently approximating the loss function; Implementing skip-gram with TensorFlow; The Continuous Bag-of-Words algorithm; Implementing CBOW in TensorFlow; Summary; Chapter 4: Advanced Word2vec; The original skip-gram algorithm; Implementing the original skip-gram algorithm; Comparing the original skip-gram with the improved skip-gram; Comparing skip-gram with CBOW; Performance comparison; Which is the winner, skip-gram or CBOW?; Extensions to the word embeddings algorithms; Using the unigram distribution for negative sampling; Implementing unigram-based negative sampling; Subsampling -- probabilistically ignoring the common words … (more)
- Publisher Details:
- Birmingham, UK : Packt
- Publication Date:
- 2018
- Copyright Date:
- 2018
- Extent:
- 1 online resource
- Subjects:
- 006.31
Computers -- Neural Networks
Machine learning
Artificial intelligence
Python (Computer program language)
COMPUTERS / General
Computers -- Programming Languages -- Python
Programming & scripting languages: general
Neural networks & fuzzy systems
Computers -- Intelligence (AI) & Semantics
Electronic books
- Languages:
- English
- ISBNs:
- 9781788477758
1788477758
- Notes:
- Note: Includes bibliographical references and index.
- Access Rights:
- Legal Deposit; Only available on premises controlled by the deposit library and to one user at any one time; The Legal Deposit Libraries (Non-Print Works) Regulations (UK).
- Access Usage:
- Restricted: Printing from this resource is governed by The Legal Deposit Libraries (Non-Print Works) Regulations (UK) and UK copyright law currently in force.
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library HMNTS - ELD.DS.290875
- Ingest File:
- 01_204.xml