Proposition Knowledge Graphs. Gabriel Stanovsky, Omer Levy, and Ido Dagan. AHA! Workshop 2014. [pdf] This position paper proposes a novel representation for Information Discovery: Proposition Knowledge Graphs. These extend the Open IE paradigm by representing semantic inter-proposition relations in a traversable graph.

Focused Entailment Graphs for Open IE Propositions. Omer Levy, Ido Dagan, and Jacob Goldberger. CoNLL 2014. [pdf] [slides] Open IE methods extract structured propositions from text. However, these propositions are neither consolidated nor generalized, and querying them may lead to insufficient or redundant information. This work suggests an approach to organizing Open IE propositions using […]

Linguistic Regularities in Sparse and Explicit Word Representations. Omer Levy and Yoav Goldberg. CoNLL 2014. [pdf] [slides] Recent work has shown that neural word embeddings capture many relational similarities, which can be recovered by simple vector arithmetic in the embedded space. This fascinating result raises a question: to what extent are the relational semantic properties a result of the embedding process? Experiments show that the RNN-based embeddings are superior to other dense representations, but how crucial is it for […]
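The regularities in question are analogy relations recovered by vector arithmetic ("man is to woman as king is to queen"). As a toy illustration only, and not code from the paper, here is a minimal sketch of that recovery rule (the additive objective the paper calls 3CosAdd), assuming a hypothetical `vectors` dict mapping words to L2-normalized numpy arrays:

```python
# Toy sketch of analogy recovery by vector arithmetic (3CosAdd).
# `vectors` is assumed to map words to unit-length numpy arrays;
# this is an illustration, not the paper's experimental code.
import numpy as np

def analogy(vectors, a, a_star, b):
    """Return the word b* maximizing cos(b*, b - a + a*)."""
    target = vectors[b] - vectors[a] + vectors[a_star]
    target /= np.linalg.norm(target)
    best_word, best_score = None, -np.inf
    for word, vec in vectors.items():
        if word in (a, a_star, b):   # exclude the query words themselves
            continue
        score = float(target @ vec)
        if score > best_score:
            best_word, best_score = word, score
    return best_word

# analogy(vectors, "man", "woman", "king")  ->  ideally "queen"
```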

Dependency-Based Word Embeddings. Omer Levy and Yoav Goldberg. Short paper in ACL 2014. [pdf] [slides] While continuous word embeddings are gaining popularity, current models are based solely on linear contexts. In this work, we generalize the skip-gram model with negative sampling introduced by Mikolov et al. to include arbitrary contexts. Code: The code used in […]
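To make the "arbitrary contexts" idea concrete, below is a minimal, hypothetical sketch of extracting dependency-based (word, context) pairs from a parsed sentence. It is not the released implementation (which builds on a modified word2vec, word2vecf) and it ignores refinements such as collapsing prepositions:

```python
# Sketch of dependency-based context extraction. Each token is a hypothetical
# (word, head_index, relation) triple from a dependency parser; head_index is
# a 0-based index into the sentence, or None for the root.

def dependency_contexts(sentence):
    """Yield (word, context) pairs whose contexts are syntactic neighbours
    rather than linear-window neighbours."""
    pairs = []
    for i, (word, head, rel) in enumerate(sentence):
        if head is None:
            continue
        head_word = sentence[head][0]
        # A word's context is its head, marked with the dependency relation...
        pairs.append((word, f"{head_word}/{rel}"))
        # ...and, symmetrically, the head sees the modifier as a context
        # marked with the inverse relation.
        pairs.append((head_word, f"{word}/{rel}-1"))
    return pairs

# Example: "Australian scientist discovers star"
parsed = [
    ("australian", 1, "amod"),
    ("scientist", 2, "nsubj"),
    ("discovers", None, "root"),
    ("star", 2, "dobj"),
]
print(dependency_contexts(parsed))
# [('australian', 'scientist/amod'), ('scientist', 'australian/amod-1'),
#  ('scientist', 'discovers/nsubj'), ('discovers', 'scientist/nsubj-1'),
#  ('star', 'discovers/dobj'), ('discovers', 'star/dobj-1')]
```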

word2vec Explained: Deriving Mikolov et al.’s Negative-Sampling Word-Embedding Method. Yoav Goldberg and Omer Levy. arXiv 2014. [pdf] The word2vec software of Tomas Mikolov and colleagues has gained a lot of traction lately, and provides state-of-the-art word embeddings. The learning models behind the software are described in two research papers. We found the description of the […]
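For orientation, the quantity the note derives is the per-pair negative-sampling objective. The following is a hedged rendering of its commonly cited form, where sigma is the logistic function, k is the number of negative samples, and P_D denotes the empirical context distribution:

```latex
% Skip-gram negative-sampling objective for one observed word-context pair (w, c):
% maximize the probability that (w, c) came from the data, while minimizing it
% for k sampled "negative" contexts c_N.
\log \sigma(\vec{w} \cdot \vec{c})
  \;+\; \sum_{i=1}^{k} \mathbb{E}_{c_N \sim P_D}\!\left[ \log \sigma(-\vec{w} \cdot \vec{c_N}) \right]
```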

The Excitement Open Platform for Textual Inferences. Bernardo Magnini, Roberto Zanoli, Ido Dagan, Kathrin Eichler, Günter Neumann, Tae-Gil Noh, Sebastian Padó, Asher Stern, and Omer Levy. Demo paper in ACL 2014. [pdf] This paper presents the Excitement Open Platform (EOP), a generic architecture and a comprehensive implementation for textual inference in multiple languages. Code: The […]

Undivide and Conquer: On Selling a Divisible and Homogeneous Good. Omer Levy, Rann Smorodinsky, and Moshe Tennenholtz. BEJTE 2014. [pdf] We demonstrate how selling a divisible good as an indivisible one may increase seller revenues, and we characterize when this phenomenon occurs and the corresponding gain factors.

Recognizing Partial Textual Entailment. Omer Levy, Torsten Zesch, Ido Dagan, and Iryna Gurevych. Short paper in ACL 2013. [pdf] [slides] Textual entailment is an asymmetric relation between two text fragments that describes whether one fragment can be inferred from the other. It thus cannot capture the notion that the target fragment is “almost entailed” by the […]

UKP-BIU: Similarity and Entailment Metrics for Student Response Analysis. Torsten Zesch, Omer Levy, Iryna Gurevych, and Ido Dagan. SemEval 2013. [pdf] [slides] Given a question, a reference answer, and a student’s answer, the task is to determine whether the student answered correctly. While this is not a new task in itself, the challenge focuses on employing textual entailment technologies […]

Teaching Machines to Learn by Metaphors. Omer Levy and Shaul Markovitch. AAAI 2012. [pdf] [slides] Humans have an uncanny ability to learn new concepts with very few examples. Cognitive theories have suggested that this is done by utilizing prior experience of related tasks. We propose to emulate this process in machines, by transforming new problems into old ones. These transformations […]