arXiv:1908.01091 [cs.LG]

Toward Understanding Catastrophic Forgetting in Continual Learning

Cuong V. Nguyen, Alessandro Achille, Michael Lam, Tal Hassner, Vijay Mahadevan, Stefano Soatto

Published 2019-08-02 (Version 1)

We study the relationship between catastrophic forgetting and properties of task sequences. In particular, given a sequence of tasks, we would like to understand which properties of this sequence influence the error rates of continual learning algorithms trained on the sequence. To this end, we propose a new procedure that makes use of recent developments in task space modeling as well as correlation analysis to specify and analyze the properties we are interested in. As an application, we apply our procedure to study two properties of a task sequence: (1) total complexity and (2) sequential heterogeneity. We show that error rates are strongly and positively correlated with a task sequence's total complexity for some state-of-the-art algorithms. We also show that, surprisingly, in some cases the error rates have no correlation, or even a negative correlation, with sequential heterogeneity. Our findings suggest directions for improving continual learning benchmarks and methods.
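The core of the procedure is a correlation analysis between a per-sequence property (e.g. total complexity, derived from a task-space model) and the error rate a continual learning algorithm attains on that sequence. The sketch below illustrates only this final correlation step on synthetic numbers; the property values, error rates, and the plain Pearson coefficient are illustrative assumptions, not the paper's actual data or exact statistic.

```python
# Hedged sketch: correlating a task-sequence property with error rate.
# All numeric values below are synthetic placeholders, not the paper's data.

import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-sequence measurements: one value per task sequence.
total_complexity = [1.2, 2.5, 3.1, 4.0, 5.3]    # e.g. sum of per-task complexities
error_rate = [0.10, 0.18, 0.22, 0.31, 0.38]     # algorithm's error on each sequence

r = pearson(total_complexity, error_rate)
print(f"correlation(total complexity, error rate) = {r:.3f}")
```

A strongly positive `r` here would mirror the paper's finding for total complexity; running the same analysis with a sequential-heterogeneity measure in place of `total_complexity` is where the paper reports near-zero or negative correlations.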

Related articles:
arXiv:2007.07617 [cs.LG] (Published 2020-07-15)
SpaceNet: Make Free Space For Continual Learning
arXiv:1910.02718 [cs.LG] (Published 2019-10-07)
Continual Learning in Neural Networks
arXiv:1904.07734 [cs.LG] (Published 2019-04-15)
Three scenarios for continual learning