Cognitively Inspired Learning of Incremental Drifting Concepts
Mohammad Rostami, Aram Galstyan
Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 3058-3066.
https://doi.org/10.24963/ijcai.2023/341
Humans continually expand their learned knowledge to new domains and learn new concepts without interfering with past learned experiences. In contrast, machine learning models perform poorly in a continual learning setting, where the input data distribution changes over time. Inspired by the learning mechanisms of the nervous system, we develop a computational model that enables a deep neural network to learn new concepts and expand its learned knowledge to new domains incrementally in a continual learning setting. We rely on the Parallel Distributed Processing theory to encode abstract concepts in an embedding space as a multimodal distribution. This embedding space is modeled by the internal data representations in a hidden network layer. We also leverage the Complementary Learning Systems theory to equip the model with a memory mechanism that overcomes catastrophic forgetting through pseudo-rehearsal. Our model can generate pseudo-data points for experience replay and accumulate new experiences into past learned knowledge without causing cross-task interference.
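The abstract describes a pseudo-rehearsal mechanism: past concepts are captured as a multimodal distribution over a hidden-layer embedding space, and pseudo-data points sampled from that distribution are replayed when learning a new task. The sketch below illustrates that general idea, assuming a Gaussian mixture model over the embeddings and a frozen copy of the previous classifier to label the sampled pseudo-embeddings; all names here (Encoder, Classifier, fit_embedding_gmm, train_task) are illustrative assumptions, not the authors' implementation.

```python
# Minimal pseudo-rehearsal sketch (illustrative, not the paper's code).
import torch
import torch.nn as nn
from sklearn.mixture import GaussianMixture

class Encoder(nn.Module):
    """Maps inputs to the hidden embedding space."""
    def __init__(self, in_dim=784, emb_dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                 nn.Linear(256, emb_dim))
    def forward(self, x):
        return self.net(x)

class Classifier(nn.Module):
    """Predicts concept labels from embeddings."""
    def __init__(self, emb_dim=64, n_classes=10):
        super().__init__()
        self.head = nn.Linear(emb_dim, n_classes)
    def forward(self, z):
        return self.head(z)

def fit_embedding_gmm(encoder, loader, n_components=10):
    """Fit a multimodal (GMM) model of the learned embedding space."""
    encoder.eval()
    with torch.no_grad():
        zs = torch.cat([encoder(x) for x, _ in loader]).cpu().numpy()
    return GaussianMixture(n_components=n_components).fit(zs)

def train_task(encoder, classifier, loader, gmm=None, old_classifier=None,
               epochs=1, replay_batch=64, lr=1e-3):
    """Train on the current task; if a GMM of past embeddings is given,
    replay pseudo-embeddings sampled from it to reduce forgetting."""
    params = list(encoder.parameters()) + list(classifier.parameters())
    opt = torch.optim.Adam(params, lr=lr)
    ce = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            loss = ce(classifier(encoder(x)), y)
            if gmm is not None and old_classifier is not None:
                # Pseudo-rehearsal: sample embeddings from the stored
                # distribution and match the frozen old classifier's labels.
                z_np, _ = gmm.sample(replay_batch)
                z = torch.tensor(z_np, dtype=torch.float32)
                with torch.no_grad():
                    pseudo_y = old_classifier(z).argmax(dim=1)
                loss = loss + ce(classifier(z), pseudo_y)
            opt.zero_grad()
            loss.backward()
            opt.step()
```

After each task, one would refit (or extend) the GMM on the newly learned embeddings and freeze a copy of the classifier, so that subsequent tasks can replay from the accumulated distribution without storing raw past data.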
Keywords:
Humans and AI: HAI: Cognitive modeling
Humans and AI: HAI: Brain sciences
Humans and AI: HAI: Cognitive systems