Abstract

Proceedings Abstracts of the Twenty-Fourth International Joint Conference on Artificial Intelligence

Embedding Semantic Relations into Word Representations / 1222
Danushka Bollegala, Takanori Maehara, Ken-ichi Kawarabayashi

Learning representations for semantic relations is important for various tasks such as analogy detection, relational search, and relation classification. Although there have been several proposals for learning representations for individual words, learning word representations that explicitly capture the semantic relations between words remains underdeveloped. We propose an unsupervised method for learning vector representations for words such that the learnt representations are sensitive to the semantic relations that exist between two words. First, we extract lexical patterns from the co-occurrence contexts of two words in a corpus to represent the semantic relations that exist between those two words. Second, we represent a lexical pattern as the weighted sum of the representations of the words that co-occur with that lexical pattern. Third, we train a binary classifier to detect relationally similar versus non-similar lexical pattern pairs. The proposed method is unsupervised in the sense that the lexical pattern pairs we use as training data are automatically sampled from a corpus, without requiring any manual intervention. Our proposed method statistically significantly outperforms the current state-of-the-art word representations on three benchmark datasets for proportional analogy detection, demonstrating its ability to accurately capture the semantic relations among words.
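
The pattern-representation and classification steps described above can be illustrated with a small sketch. The Python snippet below is not the authors' implementation; it assumes hypothetical pre-trained word vectors (word_vecs), hypothetical pattern-word co-occurrence counts (pattern_word_counts), toy lexical patterns and labels, and uses an off-the-shelf logistic regression model merely as a stand-in for the paper's binary classifier over pattern pairs.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical inputs (illustrative assumptions, not from the paper):
#   word_vecs:            word -> d-dimensional embedding
#   pattern_word_counts:  lexical pattern -> {word: co-occurrence count}
d = 50
rng = np.random.default_rng(0)
vocab = ["ostrich", "bird", "lion", "cat", "paris", "france", "tokyo", "japan"]
word_vecs = {w: rng.normal(size=d) for w in vocab}
pattern_word_counts = {
    "X is a large Y":         {"ostrich": 3, "bird": 3},
    "X is a kind of Y":       {"lion": 5, "cat": 5},
    "X is the capital of Y":  {"paris": 4, "france": 4},
    "Y's capital X":          {"tokyo": 2, "japan": 2},
}

def pattern_vector(pattern):
    """Represent a lexical pattern as the count-weighted sum of the
    embeddings of the words that co-occur with that pattern."""
    counts = pattern_word_counts[pattern]
    total = sum(counts.values())
    return sum((c / total) * word_vecs[w] for w, c in counts.items())

def pair_features(p, q):
    """Toy feature vector for a pair of pattern representations."""
    vp, vq = pattern_vector(p), pattern_vector(q)
    return np.concatenate([vp * vq, np.abs(vp - vq)])

# Toy training pairs: label 1 if the two patterns express a similar
# relation, 0 otherwise.  In the paper such pairs are sampled
# automatically from the corpus rather than listed by hand.
pairs = [
    ("X is a large Y", "X is a kind of Y", 1),
    ("X is the capital of Y", "Y's capital X", 1),
    ("X is a large Y", "X is the capital of Y", 0),
    ("X is a kind of Y", "Y's capital X", 0),
]
X = np.vstack([pair_features(p, q) for p, q, _ in pairs])
y = np.array([label for _, _, label in pairs])

clf = LogisticRegression().fit(X, y)
```

With real corpus-derived counts and embeddings in place of the random toy data, the same weighted-sum construction yields one vector per lexical pattern, and the classifier's decisions over pattern pairs supply the relational signal the abstract refers to.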