Python natural language processing : explore NLP with machine learning and deep learning techniques / Jalaj Thanaki


Bibliographic details
Main author: Thanaki, Jalaj (author)
Format: Electronic eBook
Language: English
Published: Birmingham, UK : Packt Publishing, 2017.
Subjects:
Online access: Full text
Summary: Chapter 6: Advanced Feature Engineering and NLP Algorithms -- Recall word embedding -- Understanding the basics of word2vec -- Distributional semantics -- Defining word2vec -- Necessity of unsupervised distribution semantic model - word2vec -- Challenges -- Converting the word2vec model from black box to white box -- Distributional similarity based representation -- Understanding the components of the word2vec model -- Input of the word2vec -- Output of word2vec -- Construction components of the word2vec model -- Architectural component -- Understanding the logic of the word2vec model -- Vocabulary builder -- Context builder -- Neural network with two layers -- Structural details of a word2vec neural network -- Word2vec neural network layer's details -- Softmax function -- Main processing algorithms -- Continuous bag of words -- Skip-gram -- Understanding algorithmic techniques and the mathematics behind the word2vec model -- Understanding the basic mathematics for the word2vec algorithm -- Techniques used at the vocabulary building stage -- Lossy counting -- Using it at the stage of vocabulary building -- Applications -- Techniques used at the context building stage -- Dynamic window scaling -- Understanding dynamic context window techniques -- Subsampling -- Pruning -- Algorithms used by neural networks -- Structure of the neurons -- Basic neuron structure -- Training a simple neuron -- Define error function -- Understanding gradient descent in word2vec -- Single neuron application -- Multi-layer neural networks -- Backpropagation -- Mathematics behind the word2vec model -- Techniques used to generate final vectors and probability prediction stage -- Hierarchical softmax -- Negative sampling -- Some of the facts related to word2vec -- Applications of word2vec -- Implementation of simple examples -- Famous example (king - man + woman).
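The contents note closes with the well-known word2vec analogy (king - man + woman ≈ queen). The following is a minimal sketch of that analogy using the gensim library; the toy corpus and all hyperparameter values here are illustrative assumptions and are not taken from the book.

    # Sketch of the "king - man + woman" analogy with gensim's word2vec.
    # Assumption: gensim 4.x is installed; the tiny corpus below is only a
    # placeholder -- the analogy only emerges reliably from a large corpus.
    from gensim.models import Word2Vec

    sentences = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["the", "man", "walks", "in", "the", "city"],
        ["the", "woman", "walks", "in", "the", "city"],
    ]

    # Train a small skip-gram model (sg=1); vector_size and window are arbitrary.
    model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

    # vector("king") - vector("man") + vector("woman") should be closest to "queen"
    # on a sufficiently large corpus.
    print(model.wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))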
Physical description: 1 online resource (1 volume) : illustrations
ISBN: 9781787285521; 1787285529; 9781523112173; 1523112174

No print copy is available.
