Learning from Logical Constraints with Lower- and Upper-Bound Arithmetic Circuits

Lucile Dierckx, Alexandre Dubray, Siegfried Nijssen

Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence
Main Track. Pages 5012-5020. https://doi.org/10.24963/ijcai.2025/558

An important class of neuro-symbolic (NeSy) methods relies on knowledge compilation (KC) techniques to transform logical constraints into a differentiable exact arithmetic circuit (AC) that represents all models of a logical formula. However, given the complexity of KC, compiling such exact circuits can be infeasible. In such cases, previous works proposed compiling a circuit for a subset of models. In this work, we show that gradients calculated on a subset of models can be very far from the true gradients. We propose a new framework that calculates gradients by partially compiling logical constraints into not only a lower-bound circuit but also an upper-bound circuit. We prove that, from this pair of ACs, gradients within a bounded distance of the true gradients can be calculated. Our experiments show that adding the upper-bound AC also helps the learning process in practice, allowing for similar or better generalisation than working solely with fully compiled ACs, even with less than 150 seconds of partial compilation.
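To give intuition for the bounding idea (this is a toy sketch, not the authors' algorithm or circuit representation): an exact AC for weighted model counting sums the weights of all models of a formula; a circuit compiled from a subset of models yields a lower bound, while one covering a superset yields an upper bound, bracketing the exact probability. All names below (`model_weight`, `wmc`, the example formula) are illustrative assumptions.

```python
# Toy weighted model counting over two Boolean variables.
# Exact value = sum of weights of all models of the formula "a OR b".
# A subset of models underestimates it; a superset overestimates it.

def model_weight(assignment, p):
    """Weight of one assignment: product of per-variable probabilities."""
    w = 1.0
    for var, val in assignment.items():
        w *= p[var] if val else 1.0 - p[var]
    return w

def wmc(models, p):
    """Sum of weights over a collection of assignments."""
    return sum(model_weight(m, p) for m in models)

p = {"a": 0.3, "b": 0.6}  # independent probabilities of each variable being True

# Models of "a OR b" over {a, b}:
exact_models = [
    {"a": True,  "b": True},
    {"a": True,  "b": False},
    {"a": False, "b": True},
]
lower_models = exact_models[:2]                            # subset: one model missing
upper_models = exact_models + [{"a": False, "b": False}]   # superset: one extra model

low, exact, up = wmc(lower_models, p), wmc(exact_models, p), wmc(upper_models, p)
assert low <= exact <= up  # the exact count is bracketed by the two bounds
```

Here the exact value is 1 - (1 - 0.3)(1 - 0.6) = 0.72, with the lower-bound circuit giving 0.30 and the upper-bound circuit 1.0; the paper's contribution is to show that such a pair of bounds also constrains the gradients used for learning.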
Keywords:
Machine Learning: ML: Neuro-symbolic methods/Abductive Learning
Uncertainty in AI: UAI: Probabilistic programming
Knowledge Representation and Reasoning: KRR: Learning and reasoning
Knowledge Representation and Reasoning: KRR: Knowledge compilation