arXiv Analytics

arXiv:2101.07295 [cs.LG]

Does Continual Learning = Catastrophic Forgetting?

Anh Thai, Stefan Stojanov, Isaac Rehg, James M. Rehg

Published 2021-01-18 (Version 1)

Continual learning is known to suffer from catastrophic forgetting, a phenomenon in which previously learned concepts are forgotten as the model fits more recent samples. In this work, we challenge the assumption that continual learning is inevitably associated with catastrophic forgetting by presenting a set of tasks that, surprisingly, do not suffer from catastrophic forgetting when learned continually. We provide insight into the properties of these tasks that make them robust to catastrophic forgetting, and into the potential of using a proxy representation learning task for continual classification. We further introduce YASS, a simple yet novel algorithm that outperforms state-of-the-art methods on the class-incremental categorization learning task. Finally, we present DyRT, a novel tool for tracking the dynamics of representation learning in continual models. The codebase, dataset, and pre-trained models released with this article can be found at https://github.com/ngailapdi/CLRec.
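
As a rough illustration of the catastrophic-forgetting effect the abstract refers to (this is not the paper's YASS algorithm or DyRT tool), the sketch below trains a small PyTorch classifier on two class-incremental tasks in sequence and re-evaluates each earlier task after every stage. The synthetic data, model size, and hyperparameters are assumptions made purely for the example.

```python
# Illustrative sketch only: measuring catastrophic forgetting in a
# two-task class-incremental setup with a small MLP on synthetic data.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(class_offset, n_per_class=200, dim=32, n_classes=2):
    """Synthetic Gaussian blobs, one blob per class (hypothetical data)."""
    xs, ys = [], []
    for c in range(n_classes):
        center = torch.randn(dim) * 3.0
        xs.append(center + torch.randn(n_per_class, dim))
        ys.append(torch.full((n_per_class,), class_offset + c, dtype=torch.long))
    return torch.cat(xs), torch.cat(ys)

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 4))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# Task 1 covers classes {0, 1}; task 2 covers classes {2, 3}.
tasks = [make_task(0), make_task(2)]

for t, (x, y) in enumerate(tasks, start=1):
    for _ in range(100):              # train on the current task only
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    # Re-evaluate every task seen so far; a large drop on task 1 after
    # training on task 2 is the catastrophic-forgetting signal.
    for s, (xs_, ys_) in enumerate(tasks[:t], start=1):
        print(f"after task {t}: accuracy on task {s} = {accuracy(model, xs_, ys_):.2f}")
```

In the setting the abstract describes, a method robust to forgetting would keep the task-1 accuracy from collapsing after task 2 is learned, whereas plain sequential fine-tuning like the loop above typically lets it degrade sharply.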

Related articles:
arXiv:2403.05175 [cs.LG] (Published 2024-03-08)
Continual Learning and Catastrophic Forgetting
arXiv:1811.11682 [cs.LG] (Published 2018-11-28)
Experience Replay for Continual Learning
arXiv:2108.12641 [cs.LG] (Published 2021-08-28)
Prototypes-Guided Memory Replay for Continual Learning