Category: Word Embeddings

Dependency-Based Word Embeddings

Dependency-Based Word Embeddings. Omer Levy and Yoav Goldberg. Short paper in ACL 2014. [pdf] [slides] While continuous word embeddings are gaining popularity, current models are based solely on linear contexts. In this work, we generalize the skip-gram model with negative sampling introduced by Mikolov et al. to include arbitrary contexts. Code: The code used in […]
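
The paper's key move is replacing linear bag-of-words windows with contexts derived from syntactic dependency arcs: each word is paired with its modifiers and its head, labeled by the relation. A minimal sketch of that pair extraction, assuming dependency triples of the form (head, relation, dependent); the function and triple format below are illustrative assumptions, not the authors' word2vecf code:

```python
def dependency_contexts(triples):
    """Yield (word, context) pairs from (head, relation, dependent) triples.

    Each modifier contributes the context "relation_modifier" to its head,
    and each head contributes the inverse context "relation-1_head" to the
    modifier, in the spirit of Levy & Goldberg (2014).
    """
    for head, rel, dep in triples:
        yield head, f"{rel}_{dep}"     # head sees modifier via the relation
        yield dep, f"{rel}-1_{head}"   # modifier sees head via the inverse

# Example: "scientist discovers star" with subject and object arcs.
triples = [("discovers", "nsubj", "scientist"),
           ("discovers", "dobj", "star")]
for word, ctx in dependency_contexts(triples):
    print(word, ctx)
# discovers nsubj_scientist
# scientist nsubj-1_discovers
# discovers dobj_star
# star dobj-1_discovers
```

The resulting (word, context) pairs can then be fed to skip-gram with negative sampling exactly as linear-window pairs would be, which is what makes the generalization to arbitrary contexts possible.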

word2vec Explained: Deriving Mikolov et al.’s Negative-Sampling Word-Embedding Method

word2vec Explained: Deriving Mikolov et al.’s Negative-Sampling Word-Embedding Method. Yoav Goldberg and Omer Levy. arXiv 2014. [pdf] The word2vec software of Tomas Mikolov and colleagues has gained a lot of traction lately, and provides state-of-the-art word embeddings. The learning models behind the software are described in two research papers. We found the description of the […]
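
For reference, the per-pair objective the note derives: given a word–context pair with vectors \(\vec{w}\) and \(\vec{c}\), \(k\) negative contexts \(c_N\) drawn from the empirical unigram distribution \(P_D\), and \(\sigma\) the logistic function, SGNS maximizes the following (standard form; the notation is mine, not quoted from the paper):

```latex
% Observed pairs should be scored as coming from the data; each of the
% k sampled "negative" contexts c_N should be scored as not.
\log \sigma(\vec{w} \cdot \vec{c})
  + \sum_{i=1}^{k} \mathbb{E}_{c_N \sim P_D}
      \left[ \log \sigma(-\vec{w} \cdot \vec{c_N}) \right]
```

Summing this quantity over all word–context pairs in the corpus gives the global objective that the word2vec software optimizes by stochastic gradient ascent.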