Classification with Rejection: Scaling Generative Classifiers with Supervised Deep Infomax

Xin Wang, Siu Ming Yiu

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 2980-2986. https://doi.org/10.24963/ijcai.2020/412

Deep InfoMax (DIM) is an unsupervised representation learning framework that maximizes the mutual information between the inputs and outputs of an encoder while imposing probabilistic constraints on the outputs. In this paper, we propose Supervised Deep InfoMax (SDIM), which introduces supervised probabilistic constraints on the encoder outputs. These supervised probabilistic constraints are equivalent to a generative classifier on high-level data representations, for which class-conditional log-likelihoods of samples can be evaluated. Unlike other works that build generative classifiers with conditional generative models, SDIMs scale to complex datasets and achieve performance comparable to their discriminative counterparts. With SDIM, we can perform classification with rejection: instead of always reporting a class label, SDIM makes a prediction only when a test sample's largest class-conditional log-likelihood surpasses a pre-chosen threshold; otherwise the sample is deemed out of the data distribution and rejected. Our experiments show that SDIM with this rejection policy effectively rejects illegal inputs, including adversarial examples and out-of-distribution samples.
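The rejection policy described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the per-class thresholds, and the toy log-likelihood values are all hypothetical.

```python
import numpy as np

def classify_with_rejection(log_likelihoods, thresholds):
    """Predict the argmax class only when its class-conditional
    log-likelihood surpasses that class's pre-chosen threshold;
    otherwise reject the input as out-of-distribution.
    (Illustrative sketch; names and values are not from the paper.)"""
    pred = int(np.argmax(log_likelihoods))
    if log_likelihoods[pred] >= thresholds[pred]:
        return pred   # accept: report the class label
    return None       # reject: deemed out of the data distribution

# Toy example with 3 classes and per-class thresholds
thresholds = np.array([-5.0, -5.0, -5.0])
in_dist = np.array([-2.0, -8.0, -9.0])   # confident: accepted as class 0
ood = np.array([-7.5, -6.9, -8.8])       # all below threshold: rejected

print(classify_with_rejection(in_dist, thresholds))  # 0
print(classify_with_rejection(ood, thresholds))      # None
```

In practice the thresholds would be chosen per class on held-out in-distribution data (e.g. a low percentile of training log-likelihoods), trading off rejection rate against accuracy on accepted samples.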
Keywords:
Machine Learning: Classification
Multidisciplinary Topics and Applications: Security and Privacy
Machine Learning: Learning Generative Models