SmartBERT: A Promotion of Dynamic Early Exiting Mechanism for Accelerating BERT Inference

Boren Hu, Yun Zhu, Jiacheng Li, Siliang Tang

Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 5067-5075. https://doi.org/10.24963/ijcai.2023/563

Dynamic early exiting has been proven to improve the inference speed of pre-trained language models such as BERT. However, under this mechanism all samples must pass through consecutive layers before exiting, and more complex samples usually pass through more layers, so redundant computation remains. In this paper, we propose SmartBERT, a novel dynamic early-exiting mechanism combined with layer skipping for BERT inference, which adds a skipping gate and an exiting operator to each layer of BERT. SmartBERT can adaptively skip some layers and adaptively choose whether to exit. In addition, we propose cross-layer contrastive learning and incorporate it into our training phases to strengthen the intermediate layers and classifiers, which benefits early exiting. To handle the inconsistent usage of skipping gates between the training and inference phases, we propose a hard weight mechanism during the training phase. We conduct experiments on eight classification datasets of the GLUE benchmark. Experimental results show that SmartBERT achieves a 2-3× reduction in computation with minimal accuracy drops compared with BERT, and our method outperforms previous methods in both efficiency and accuracy. Moreover, on some complex datasets, we show that entropy-based early exiting alone hardly works, and that the skipping mechanism is essential for reducing computation.
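To make the inference-time control flow concrete, below is a minimal Python/PyTorch sketch of a per-layer loop with a skipping gate and an entropy-based exit classifier, as the abstract describes. It assumes linear gates and classifiers applied to the [CLS] token and a fixed entropy threshold; the names (smart_inference, exit_threshold) and the 0.5 gate cutoff are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn.functional as F

def entropy(logits):
    """Shannon entropy of the softmax distribution; low entropy = confident."""
    probs = F.softmax(logits, dim=-1)
    return -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)

@torch.no_grad()
def smart_inference(hidden, layers, gates, classifiers, exit_threshold=0.3):
    """Layer-by-layer inference with skipping gates and entropy-based exiting.

    hidden:      [1, seq_len, dim] encoder input (batch size 1 for clarity)
    layers:      list of transformer blocks
    gates:       list of skipping gates, one per layer (e.g. nn.Linear(dim, 1))
    classifiers: list of exit classifiers, one per layer
    """
    logits = None
    for layer, gate, clf in zip(layers, gates, classifiers):
        # Skipping gate: decide from the current hidden state whether to
        # bypass this transformer layer entirely (assumed 0.5 cutoff).
        skip_prob = torch.sigmoid(gate(hidden[:, 0]))  # gate on [CLS]
        if skip_prob.item() <= 0.5:
            hidden = layer(hidden)
        # Exiting operator: stop once the prediction is confident enough.
        logits = clf(hidden[:, 0])
        if entropy(logits).item() < exit_threshold:
            break
    return logits
```

In this sketch, skipping saves the cost of a full transformer block even when the sample is not yet confident enough to exit, which is why skipping can still reduce computation on complex datasets where entropy-based exiting alone rarely triggers.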
Keywords:
Natural Language Processing: NLP: Language models
Natural Language Processing: NLP: Text classification