Higher-order Logical Knowledge Representation Learning

Suixue Wang, Weiliang Huo, Shilin Zhang, Qingchen Zhang

Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence
Main Track. Pages 3398-3406. https://doi.org/10.24963/ijcai.2025/378

Real-world knowledge graphs abound with higher-order logical relations that simple triples, limited to pairwise connections, fail to represent. Capturing higher-order logical relations involving multiple entities has therefore garnered significant attention; however, existing methods ignore the structural information within these relations. To this end, we propose a higher-order logical knowledge representation learning method, named LORE, which leverages network motifs (patterns/subgraphs that naturally capture structural information in graphs) to extract higher-order features and ultimately learn effective representations of knowledge graphs. In contrast to existing approaches, LORE aggregates the attribute features of entities with the extracted higher-order logical relations to form enhanced representations of knowledge graphs. In particular, three aggregators (i.e., Hadamard, Connection, and Summation) are proposed and employed. Extensive experiments have been conducted on six real-world datasets for two downstream tasks (i.e., entity classification and link prediction). The results show that LORE outperforms the baselines significantly and consistently.
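The abstract names three aggregators (Hadamard, Connection, and Summation) for combining entity attribute features with higher-order motif features, but does not give their formulas. The sketch below is a plausible reading under common conventions: Hadamard as element-wise product, Connection as concatenation, and Summation as element-wise addition; the function name `aggregate` and the `mode` parameter are hypothetical, not from the paper.

```python
import numpy as np

def aggregate(attr_feat, motif_feat, mode="hadamard"):
    """Combine entity attribute features with higher-order (motif-based)
    features. The three modes mirror the aggregators named in the abstract,
    under assumed standard definitions (not confirmed by the paper)."""
    attr_feat = np.asarray(attr_feat, dtype=float)
    motif_feat = np.asarray(motif_feat, dtype=float)
    if mode == "hadamard":
        # Element-wise product of the two feature vectors.
        return attr_feat * motif_feat
    if mode == "summation":
        # Element-wise sum of the two feature vectors.
        return attr_feat + motif_feat
    if mode == "connection":
        # Concatenation along the feature dimension (doubles dimensionality).
        return np.concatenate([attr_feat, motif_feat], axis=-1)
    raise ValueError(f"unknown aggregator: {mode}")

# Toy usage: a 2-d attribute vector and a 2-d motif vector.
a = np.array([1.0, 2.0])
m = np.array([3.0, 4.0])
print(aggregate(a, m, "hadamard"))    # [3. 8.]
print(aggregate(a, m, "summation"))   # [4. 6.]
print(aggregate(a, m, "connection"))  # [1. 2. 3. 4.]
```

Note that Hadamard and Summation preserve the feature dimension, while Connection doubles it, which affects the input size of any downstream classifier or scoring layer.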
Keywords:
Data Mining: DM: Knowledge graphs and knowledge base completion
Computer Vision: CV: Representation learning
Knowledge Representation and Reasoning: KRR: Knowledge representation languages
Machine Learning: ML: Representation learning