Deeper Connections between Neural Networks and Gaussian Processes Speed-up Active Learning

Evgenii Tsymbalov, Sergei Makarychev, Alexander Shapeev, Maxim Panov

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 3599-3605. https://doi.org/10.24963/ijcai.2019/499

Active learning methods for neural networks are usually based on greedy criteria, which ultimately yield a single new design point per evaluation. Such an approach requires either heuristics to sample a batch of design points in one active learning iteration, or retraining the neural network after each added data point, which is computationally inefficient. Moreover, uncertainty estimates for neural networks are sometimes overconfident for points lying far from the training sample. In this work, we propose to approximate Bayesian neural networks (BNNs) by Gaussian processes (GPs), which allows us to update the uncertainty estimates of predictions efficiently without retraining the neural network, while avoiding overconfident uncertainty predictions for out-of-sample points. In a series of experiments on real-world data, including large-scale problems of chemical and physical modeling, we show the superiority of the proposed approach over state-of-the-art methods.
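The general idea can be illustrated with a minimal sketch (not the authors' code): in the paper the GP approximates a Bayesian neural network, with its covariance estimated from stochastic (e.g., dropout) forward passes of the network, whereas below a plain RBF kernel stands in for that covariance. The key property exploited is that GP posterior variance depends only on the input locations, not on the labels, so a whole batch of design points can be selected greedily without querying labels or retraining any network. All names here (toy_function, select_batch) are hypothetical and for illustration only.

```python
# A minimal, illustrative sketch of GP-based batch active learning in the
# spirit of the abstract; the RBF kernel is an assumption standing in for
# the NN-derived covariance used in the paper.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def toy_function(x):  # hypothetical stand-in for an expensive simulator
    return np.sin(3.0 * x[:, 0]) + 0.1 * rng.standard_normal(len(x))

X_train = rng.uniform(-2.0, 2.0, size=(20, 1))
y_train = toy_function(X_train)
X_pool = rng.uniform(-2.0, 2.0, size=(500, 1))  # candidate design points

# Fit the surrogate GP once on the labeled data (kernel hyperparameters
# are optimized only in this step).
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-2)
gp.fit(X_train, y_train)

def select_batch(fitted_gp, X_known, X_pool, batch_size):
    """Greedy maximum-variance batch selection.

    Each picked point is appended to the design with a dummy target:
    the GP predictive variance is label-independent, so the dummy
    values do not affect which point is chosen next.
    """
    # Freeze the kernel hyperparameters learned on the real data.
    gp_var = GaussianProcessRegressor(kernel=fitted_gp.kernel_,
                                      alpha=1e-2, optimizer=None)
    X_design = X_known.copy()
    pool = X_pool.copy()
    chosen = []
    for _ in range(batch_size):
        # Refit with dummy labels: updates the variance estimates
        # without any network retraining or new labels.
        gp_var.fit(X_design, np.zeros(len(X_design)))
        _, std = gp_var.predict(pool, return_std=True)
        i = int(np.argmax(std))  # most uncertain candidate
        chosen.append(pool[i])
        X_design = np.vstack([X_design, pool[i:i + 1]])
        pool = np.delete(pool, i, axis=0)
    return np.array(chosen)

batch = select_batch(gp, X_train, X_pool, batch_size=5)
print("selected batch of design points:\n", batch)
```

In the paper's setting the same label-independence argument applies, with the stand-in RBF kernel replaced by the covariance extracted from the stochastic neural network, which is what makes batch selection cheap relative to retraining after every point.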
Keywords:
Machine Learning: Active Learning
Uncertainty in AI: Uncertainty in AI
Uncertainty in AI: Approximate Probabilistic Inference
Uncertainty in AI: Bayesian Networks
Machine Learning: Deep Learning