Updating Probability Intervals with Uncertain Inputs

Karim Tabia

Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 2748-2754. https://doi.org/10.24963/ijcai.2022/381

Probability intervals provide an intuitive, powerful and unifying setting for encoding and reasoning with imprecise beliefs. This paper addresses the problem of updating uncertain information specified in the form of probability intervals with new uncertain inputs also expressed as probability intervals. We place ourselves in the framework of Jeffrey's rule of conditioning and propose extensions of this rule to the interval-based setting. More precisely, we first extend Jeffrey's rule to credal sets, then propose extensions of Jeffrey's rule for three common conditioning rules for probability intervals (robust, Dempster and geometric conditioning). While the first extension is based on conditioning the extreme points of the credal sets induced by the probability intervals, the other methods directly revise the interval bounds of the distributions to be updated. Finally, the paper discusses related issues and positions the proposed methods with respect to the state of the art.
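The first extension described in the abstract, conditioning the extreme points of a credal set with Jeffrey's rule and reading off interval bounds, can be illustrated with a small sketch. This is a hypothetical toy example, not the paper's notation or algorithm: the atoms, the two extreme points, the partition and the input weights `q` are all made up for illustration, and the credal set is given directly by its extreme points rather than derived from interval constraints.

```python
def jeffrey_update(p, partition, q):
    """Classical Jeffrey's rule: P'(w) = sum_i q_i * P(w | E_i),
    where {E_i} is a partition of the atoms and q_i are the new
    (precise) probabilities assigned to the partition cells.
    p: dict mapping each atom to its probability."""
    updated = {}
    for w in p:
        total = 0.0
        for E_i, q_i in zip(partition, q):
            if w in E_i:
                p_Ei = sum(p[x] for x in E_i)  # prior mass of the cell
                if p_Ei > 0:
                    total += q_i * p[w] / p_Ei
        updated[w] = total
    return updated

# Toy credal set over atoms {a, b, c}, given by two extreme points
# (in general, extreme points would be extracted from the intervals).
extremes = [
    {"a": 0.2, "b": 0.3, "c": 0.5},
    {"a": 0.4, "b": 0.3, "c": 0.3},
]
partition = [{"a", "b"}, {"c"}]  # the uncertain input bears on this partition
q = [0.7, 0.3]                   # new weights for the two cells

# Apply Jeffrey's rule point-wise, then recover interval bounds per atom.
updated = [jeffrey_update(p, partition, q) for p in extremes]
bounds = {w: (min(p[w] for p in updated), max(p[w] for p in updated))
          for w in updated[0]}
```

Each updated extreme point is again a probability distribution (the `q_i` sum to one), and the min/max over updated extreme points gives the revised lower and upper probability of each atom.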
Keywords:
Knowledge Representation and Reasoning: Belief Change
Knowledge Representation and Reasoning: Reasoning about Knowledge and Belief
Uncertainty in AI: Uncertainty Representations