Learning Mixtures of Random Utility Models with Features from Incomplete Preferences

Zhibing Zhao, Ao Liu, Lirong Xia

Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 3780-3786. https://doi.org/10.24963/ijcai.2022/525

Random Utility Models (RUMs), which subsume the Plackett-Luce (PL) model as a special case, are among the most popular models for preference learning. In this paper, we consider RUMs with features and their mixtures, where each alternative has a vector of features, possibly different across agents. Such models significantly generalize the standard PL model and RUMs, but are not as well investigated in the literature. We extend mixtures of RUMs with features to models that generate incomplete preferences and characterize their identifiability. For PL, we prove that when PL with features is identifiable, its MLE is consistent with a strictly concave objective function under mild assumptions, by characterizing a bound on the root-mean-square error (RMSE), which naturally leads to a sample complexity bound. We also characterize the identifiability of more general RUMs with features and propose a generalized RBCML to learn them. Our experiments on synthetic data demonstrate the effectiveness of the MLE for PL with features, with tradeoffs between statistical efficiency and computational efficiency. Our experiments on real-world data show the predictive power of PL with features and its mixtures.
Keywords:
Machine Learning: Learning Preferences or Rankings
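As a concrete illustration of the model class the abstract describes (this sketch is not from the paper itself), PL with features assigns each alternative a linear utility from its feature vector, and the likelihood of a (possibly partial) ranking is a product of softmax choice probabilities over the remaining alternatives. A minimal negative log-likelihood, assuming linear utilities u = Xθ and illustrative names throughout:

```python
import numpy as np

def pl_neg_log_likelihood(theta, features, rankings):
    """Negative log-likelihood of Plackett-Luce with features (illustrative sketch).

    theta:    (d,) parameter vector shared across agents.
    features: list of (m, d) arrays, one per agent, a feature vector per alternative.
    rankings: list of index arrays, most-preferred first; may be partial (top-k).
    """
    nll = 0.0
    for X, sigma in zip(features, rankings):
        u = X @ theta  # linear utility of each alternative for this agent
        for k in range(len(sigma) - 1):
            rest = sigma[k:]
            # log P(sigma[k] is chosen first among the remaining alternatives)
            nll -= u[sigma[k]] - np.log(np.sum(np.exp(u[rest])))
    return nll

# Example: with theta = 0 all utilities are equal, so a full ranking of 3
# alternatives has probability (1/3) * (1/2) = 1/6.
theta = np.zeros(2)
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
nll = pl_neg_log_likelihood(theta, [X], [np.array([0, 1, 2])])
# nll == log(6)
```

Because this objective is convex in θ (the paper proves strict concavity of the log-likelihood under mild assumptions), a generic optimizer such as `scipy.optimize.minimize` can recover the MLE from pooled rankings.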