{ "id": "2101.07295", "version": "v1", "published": "2021-01-18T19:29:12.000Z", "updated": "2021-01-18T19:29:12.000Z", "title": "Does Continual Learning = Catastrophic Forgetting?", "authors": [ "Anh Thai", "Stefan Stojanov", "Isaac Rehg", "James M. Rehg" ], "categories": [ "cs.LG", "cs.CV" ], "abstract": "Continual learning is known for suffering from catastrophic forgetting, a phenomenon where earlier learned concepts are forgotten at the expense of more recent samples. In this work, we challenge the assumption that continual learning is inevitably associated with catastrophic forgetting by presenting a set of tasks that surprisingly do not suffer from catastrophic forgetting when learned continually. We attempt to provide an insight into the property of these tasks that make them robust to catastrophic forgetting and the potential of having a proxy representation learning task for continual classification. We further introduce a novel yet simple algorithm, YASS that outperforms state-of-the-art methods in the class-incremental categorization learning task. Finally, we present DyRT, a novel tool for tracking the dynamics of representation learning in continual models. The codebase, dataset and pre-trained models released with this article can be found at https://github.com/ngailapdi/CLRec.", "revisions": [ { "version": "v1", "updated": "2021-01-18T19:29:12.000Z" } ], "analyses": { "keywords": [ "catastrophic forgetting", "continual learning", "outperforms state-of-the-art methods", "class-incremental categorization learning task", "proxy representation learning task" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }