arXiv:1802.08250 [cs.LG]

Overcoming Catastrophic Forgetting in Convolutional Neural Networks by Selective Network Augmentation

Abel S. Zacarias, Luís A. Alexandre

Published 2018-02-22 (Version 1)

Lifelong learning aims to develop machine learning systems that can learn new tasks while preserving performance on previous tasks. This approach can be applied, for example, to prevent accidents in autonomous vehicles by applying knowledge learned in previous situations. In this paper we present a method that overcomes catastrophic forgetting: it learns new tasks and preserves performance on old tasks without accessing the data used to train the original model, by selective network augmentation, using convolutional neural networks for image classification. The experimental results show that our method outperforms the state-of-the-art Learning without Forgetting algorithm in some scenarios. The results also show that in some situations it is better to use our model than to train a neural network with isolated learning.
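
The abstract does not spell out the augmentation rule, but one common way to realize the idea of learning a new task without touching the old one is to freeze the previously trained parameters and grow a new task-specific part of the network. The following is a minimal sketch under that assumption, not the paper's exact method; the class and function names (AugmentableNet, add_task) are illustrative, not from the source.

    # Sketch: freeze learned weights, then augment the network with a new
    # head for each incoming task, so old-task performance is preserved.
    import torch
    import torch.nn as nn

    class AugmentableNet(nn.Module):
        def __init__(self, num_classes_task0: int):
            super().__init__()
            # Shared convolutional feature extractor, trained on the first task.
            self.backbone = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )
            # One classifier head per task; start with the original task.
            self.heads = nn.ModuleList([nn.Linear(16, num_classes_task0)])

        def add_task(self, num_classes: int) -> int:
            # Freeze everything learned so far, so old tasks are preserved,
            # then add a fresh trainable head for the new task.
            for p in self.parameters():
                p.requires_grad = False
            self.heads.append(nn.Linear(16, num_classes))
            return len(self.heads) - 1  # index of the new task

        def forward(self, x: torch.Tensor, task: int) -> torch.Tensor:
            return self.heads[task](self.backbone(x))

    net = AugmentableNet(num_classes_task0=10)
    task_id = net.add_task(num_classes=5)          # augment for a second task
    logits = net(torch.randn(2, 3, 32, 32), task=task_id)
    print(logits.shape)                            # torch.Size([2, 5])

In this sketch only the new head receives gradients after add_task, which is what keeps old-task performance intact without storing the old training data; the paper's "selective" augmentation would decide which parts of the network to grow rather than always adding a full head.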

Related articles:
arXiv:1711.10284 [cs.LG] (Published 2017-11-28)
Between-class Learning for Image Classification
arXiv:1812.04439 [cs.LG] (Published 2018-12-11)
Synergy Effect between Convolutional Neural Networks and the Multiplicity of SMILES for Improvement of Molecular Prediction
arXiv:1811.10746 [cs.LG] (Published 2018-11-26)
MATCH-Net: Dynamic Prediction in Survival Analysis using Convolutional Neural Networks