arXiv Analytics

arXiv:1707.07103 [cs.CV]

PatchShuffle Regularization

Guoliang Kang, Xuanyi Dong, Liang Zheng, Yi Yang

Published 2017-07-22, Version 1

This paper focuses on regularizing the training of convolutional neural networks (CNNs). We propose a new regularization approach named PatchShuffle that can be adopted in any classification-oriented CNN model. It is easy to implement: in each mini-batch, images or feature maps are randomly chosen to undergo a transformation in which the pixels within each local patch are shuffled. By generating images and feature maps whose interior patches are orderless, PatchShuffle creates rich local variations, reduces the risk of overfitting, and can be viewed as a beneficial supplement to other regularization techniques such as weight decay, model ensembles, and dropout. Experiments on four representative classification datasets show that PatchShuffle improves the generalization ability of CNNs, especially when data is scarce. Moreover, we empirically show that CNN models trained with PatchShuffle are more robust to noise and to local changes in an image.
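The transformation described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the patch size, the per-image application probability, and the non-overlapping patch tiling are assumptions chosen for clarity.

```python
import numpy as np

def patch_shuffle(x, patch_size=2, prob=0.05, rng=None):
    """Shuffle pixels inside non-overlapping local patches of randomly
    chosen images in a mini-batch (illustrative sketch, not the paper's code).

    x          : array of shape (N, H, W) or (N, H, W, C) -- a mini-batch.
    patch_size : side length k of each local patch (assumed value).
    prob       : per-image probability of applying the transform (assumed value).
    """
    rng = rng or np.random.default_rng()
    x = x.copy()  # leave the input batch untouched
    n, h, w = x.shape[:3]
    k = patch_size
    for i in range(n):
        if rng.random() >= prob:
            continue  # this image is left unchanged
        # tile the image with non-overlapping k x k patches
        for r in range(0, h - h % k, k):
            for c in range(0, w - w % k, k):
                patch = x[i, r:r + k, c:c + k]
                flat = patch.reshape(k * k, *patch.shape[2:]).copy()
                rng.shuffle(flat, axis=0)  # permute the k*k pixels of the patch
                x[i, r:r + k, c:c + k] = flat.reshape(patch.shape)
    return x
```

Because only the order of pixels within each patch changes, every patch keeps the same multiset of pixel values, which is why the global image statistics are preserved while rich local variations are introduced.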

Related articles:
arXiv:1901.10415 [cs.CV] (Published 2019-01-29)
MgNet: A Unified Framework of Multigrid and Convolutional Neural Network
arXiv:2103.15425 [cs.CV] (Published 2021-03-29)
FocusedDropout for Convolutional Neural Network
arXiv:1412.8341 [cs.CV] (Published 2014-12-29)
Spectral classification using convolutional neural networks