Learning Conceptual Space Representations of Interrelated Concepts
Zied Bouraoui, Steven Schockaert
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 1760-1766.
https://doi.org/10.24963/ijcai.2018/243
Several recently proposed methods aim to learn conceptual space representations from large text collections. These learned representations associate each object from a given domain of interest with a point in a high-dimensional Euclidean space, but they do not model the concepts from this domain, and thus cannot be used directly for categorization and related cognitive tasks. A natural solution is to represent concepts as Gaussians, learned from the representations of their instances, but this can only be done reliably if sufficiently many instances are given, which is often not the case. In this paper, we introduce a Bayesian model which addresses this problem by constructing informative priors from background knowledge about how the concepts of interest are interrelated. We show that this leads to substantially better predictions in a knowledge base completion task.
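To make the core idea concrete, the sketch below shows one standard way to estimate a Gaussian concept representation from few instances using an informative conjugate (Normal-Inverse-Wishart) prior. This is a simplified illustration, not the paper's actual model: here the prior is built from the instances of a hypothetical related concept, standing in for the background knowledge about interrelated concepts that the paper exploits. All variable names and the toy data are assumptions for illustration.

```python
import numpy as np
from scipy.stats import multivariate_normal

def niw_posterior(X, mu0, kappa0, Psi0, nu0):
    """Update a Normal-Inverse-Wishart prior with instance vectors X (n x d).

    Returns posterior-mean estimates of the concept's Gaussian (mean, cov).
    """
    n, d = X.shape
    xbar = X.mean(axis=0)
    S = (X - xbar).T @ (X - xbar)        # scatter matrix of the instances

    kappa_n = kappa0 + n
    nu_n = nu0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    diff = (xbar - mu0).reshape(-1, 1)
    Psi_n = Psi0 + S + (kappa0 * n / kappa_n) * (diff @ diff.T)

    cov = Psi_n / (nu_n - d - 1)         # posterior mean of the covariance
    return mu_n, cov

# Hypothetical toy setting: a 5-dimensional embedding space.
rng = np.random.default_rng(0)
d = 5

# Build the prior from a well-populated related concept (e.g. a sibling
# category); its instances suggest where the sparse concept should sit.
related = rng.normal(loc=1.0, scale=0.5, size=(200, d))
mu0 = related.mean(axis=0)
Psi0 = np.cov(related, rowvar=False)     # with nu0 = d + 2, the prior
nu0 = d + 2.0                            # expected covariance equals Psi0
kappa0 = 5.0                             # prior strength (pseudo-count)

# Only three observed instances of the target concept:
X = rng.normal(loc=1.2, scale=0.5, size=(3, d))

mean, cov = niw_posterior(X, mu0, kappa0, Psi0, nu0)
print("posterior mean:", np.round(mean, 2))
print("density of a candidate instance:",
      multivariate_normal(mean, cov).pdf(np.full(d, 1.1)))
```

With only three instances, the maximum-likelihood covariance would be singular; the prior scatter matrix Psi0 regularizes the estimate, which is the role the paper's background-knowledge priors play in a more principled form.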
Keywords:
Knowledge Representation and Reasoning: Common-Sense Reasoning
Knowledge Representation and Reasoning: Geometric, Spatial, and Temporal Reasoning
Humans and AI: Cognitive Modeling
Natural Language Processing: Embeddings