Calibrating a Deep Neural Network with Its Predecessors

Linwei Tao, Minjing Dong, Daochang Liu, Changming Sun, Chang Xu

Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 4271-4279. https://doi.org/10.24963/ijcai.2023/475

Confidence calibration, the process of calibrating the output probability distribution of neural networks, is essential for safety-critical applications of such networks. Recent works verify the link between mis-calibration and overfitting. However, early stopping, as a well-known technique to mitigate overfitting, fails to calibrate networks. In this work, we study the limitations of early stopping and comprehensively analyze the overfitting problem of a network, considering each individual block. We then propose a novel regularization method, predecessor combination search (PCS), to improve calibration by searching for a combination of best-fitting block predecessors, where block predecessors are the corresponding network blocks with weight parameters from earlier training stages. PCS achieves state-of-the-art calibration performance on multiple datasets and architectures. In addition, PCS improves model robustness under dataset distribution shift. Supplementary material and code are available at https://github.com/Linwei94/PCS
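
The sketch below illustrates the block-predecessor idea described in the abstract: per-block weight snapshots are kept from earlier epochs, and combinations of these snapshots are evaluated by calibration error on held-out data. It is only a minimal illustration under assumed names (`blocks`, `snapshots`, `val_loader`) and uses brute-force enumeration rather than the paper's actual PCS search procedure.

```python
# Minimal sketch: pick, for each network block, the earlier-epoch snapshot
# ("predecessor") whose combination gives the lowest calibration error.
# All helper names and the exhaustive search are illustrative assumptions.
import copy
import itertools
import torch


def ece(probs, labels, n_bins=15):
    """Expected Calibration Error over equally spaced confidence bins."""
    conf, pred = probs.max(dim=1)
    correct = pred.eq(labels).float()
    edges = torch.linspace(0, 1, n_bins + 1)
    err = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            err += mask.float().mean() * (correct[mask].mean() - conf[mask].mean()).abs()
    return float(err)


def search_predecessors(model, block_names, snapshots, val_loader, device="cpu"):
    """Try every assignment of saved epoch snapshots to blocks.

    `snapshots` maps a block name to a list of state_dicts saved at earlier
    training epochs. Returns the combination (one epoch index per block)
    with the lowest ECE on the validation set.
    """
    best_combo, best_ece = None, float("inf")
    for combo in itertools.product(*(range(len(snapshots[b])) for b in block_names)):
        candidate = copy.deepcopy(model)
        for name, epoch_idx in zip(block_names, combo):
            getattr(candidate, name).load_state_dict(snapshots[name][epoch_idx])
        candidate.eval()
        probs, labels = [], []
        with torch.no_grad():
            for x, y in val_loader:
                probs.append(torch.softmax(candidate(x.to(device)), dim=1).cpu())
                labels.append(y)
        score = ece(torch.cat(probs), torch.cat(labels))
        if score < best_ece:
            best_combo, best_ece = combo, score
    return best_combo, best_ece
```

Exhaustive enumeration grows exponentially with the number of blocks and snapshots; the paper's PCS method instead searches this combination space with a learned, differentiable procedure (see the linked repository for the authors' implementation).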
Keywords:
Machine Learning: ML: Classification