Constrained Sequential Inference in Machine Learning Using Constraint Programming
Virasone Manibod, David Saikali, Gilles Pesant
Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence
Main Track. Pages 2628-2636.
https://doi.org/10.24963/ijcai.2025/293
Sequence models in machine learning often struggle to exhibit long-term structure. We consider this problem at inference time, in the context of enforcing constraints that are not necessarily featured in the dataset on which the generative model was trained. The difficulty lies in imposing previously unseen structure while staying close to the training dataset. It is particularly hard for long-term structure, which requires balancing foresight over the many tokens yet to be generated against the immediacy of next-token predictions from the sequence model. We address this problem by introducing our neurosymbolic framework GeAI-BLAnC. The learned probabilities of the sequence model are combined with the marginal probabilities computed by belief propagation over a constraint programming model expressing the desired structure.
The next predicted token is then selected from the resulting probability distribution. Experiments in the context of molecule and music generation show that we can achieve the structure imposed post-training without straying too much from the structure of the dataset learned during training.
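As a rough illustration of the mixing step described in the abstract, the sketch below combines a sequence model's next-token distribution with constraint marginals and samples the next token from the result. This is not the paper's implementation: the names (model_probs, cp_marginals, mix_weight) and the weighted-product mixing rule are illustrative assumptions only.

# Minimal sketch, assuming we already have (1) next-token probabilities from a
# trained sequence model and (2) marginal probabilities for the same position
# obtained by belief propagation on a constraint programming model of the
# desired structure. Names and the mixing rule are illustrative, not the
# paper's API.
import numpy as np

def select_next_token(model_probs, cp_marginals, mix_weight=0.5, rng=None):
    """Combine the two distributions and sample the next token.

    model_probs  -- learned next-token distribution over the vocabulary
    cp_marginals -- marginals from belief propagation on the constraint model
                    (zero where the constraints rule a token out)
    mix_weight   -- how much weight to give the constraint marginals
    """
    rng = rng or np.random.default_rng()
    # One simple way to mix: a weighted geometric combination, renormalized.
    mixed = (model_probs ** (1.0 - mix_weight)) * (cp_marginals ** mix_weight)
    total = mixed.sum()
    if total == 0.0:
        # The constraints exclude every token the model favours; fall back to
        # the constraint marginals alone so the imposed structure still holds.
        mixed, total = cp_marginals, cp_marginals.sum()
    return int(rng.choice(len(mixed), p=mixed / total))

# Toy usage over a 4-token vocabulary.
p_model = np.array([0.50, 0.30, 0.15, 0.05])
p_cp = np.array([0.00, 0.60, 0.40, 0.00])   # constraints forbid tokens 0 and 3
next_token = select_next_token(p_model, p_cp, mix_weight=0.5)

A higher mix_weight pushes generation toward satisfying the imposed structure, while a lower one stays closer to the distribution learned from the training data; the paper's actual balancing scheme may differ.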
Keywords:
Constraint Satisfaction and Optimization: CSO: Constraint programming
Machine Learning: ML: Generative models
Machine Learning: ML: Neuro-symbolic methods/Abductive Learning
Machine Learning: ML: Structured prediction
