Feature Staleness Aware Incremental Learning for CTR Prediction
Zhikai Wang, Yanyan Shen, Zibin Zhang, Kangyi Lin
Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 2352-2360.
https://doi.org/10.24963/ijcai.2023/261
Click-through rate (CTR) prediction in real-world recommender systems often deals with billions of user interactions every day. To improve training efficiency, it is common to update the CTR prediction model incrementally using the new incremental data together with a subset of historical data. However, the feature embeddings of a CTR prediction model often become stale when the corresponding features do not appear in the current incremental data. In the next period, the model's performance degrades on samples containing these stale features, which we call the feature staleness problem. To mitigate this problem, we propose a Feature Staleness Aware Incremental Learning method for CTR prediction (FeSAIL), which adaptively replays samples containing stale features. We first introduce a staleness-aware sampling algorithm (SAS) that samples a fixed number of stale samples with high sampling efficiency. We then introduce a staleness-aware regularization mechanism (SAR) for fine-grained control of the feature embedding updates. We instantiate FeSAIL with a general deep learning-based CTR prediction model, and the experimental results demonstrate that FeSAIL outperforms various state-of-the-art methods on four benchmark datasets. The code can be found at https://github.com/cloudcatcher888/FeSAIL.
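To make the staleness notion concrete, here is a minimal sketch of weighted replay sampling in the spirit of SAS. It is an illustrative assumption, not the paper's actual algorithm: it tracks the last incremental step at which each feature appeared and samples a fixed budget of historical examples, weighting each by the staleness of its stalest feature. The function names and weighting scheme are hypothetical.

```python
import random


def update_last_seen(last_seen, batch_features, step):
    """Record the latest incremental step at which each feature appeared."""
    for feats in batch_features:
        for f in feats:
            last_seen[f] = step


def staleness_aware_sample(history, last_seen, step, budget, seed=None):
    """Sample `budget` historical examples, each weighted by the staleness
    (steps since last appearance) of its stalest feature.

    `history` is a list of feature lists; unseen features get staleness `step`.
    """
    rng = random.Random(seed)
    weights = [
        # +1 keeps fully fresh samples at nonzero probability
        max(step - last_seen.get(f, 0) for f in feats) + 1
        for feats in history
    ]
    return rng.choices(range(len(history)), weights=weights, k=budget)
```

Samples whose features all appeared in the latest incremental batch get minimal weight, so the fixed replay budget is spent mostly on samples that would otherwise leave their feature embeddings untouched and stale.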
Keywords:
Data Mining: DM: Recommender systems
Machine Learning: ML: Incremental learning