arXiv:1802.10419 [cs.CV]

Convolutional Neural Networks with Alternately Updated Clique

Yibo Yang, Zhisheng Zhong, Tiancheng Shen, Zhouchen Lin

Published 2018-02-28 (Version 1)

Improving information flow in deep networks helps to ease training difficulties and use parameters more efficiently. Here we propose a new convolutional neural network architecture with alternately updated clique (CliqueNet). In contrast to prior networks, there are both forward and backward connections between any two layers in the same block. The layers are constructed as a loop and are updated alternately. CliqueNet has some unique properties. Each layer is both the input and the output of every other layer in the same block, so that information flow among layers is maximized. During propagation, the newly updated layers are concatenated to re-update the previously updated layers, and parameters are reused multiple times. This recurrent feedback structure brings higher-level visual information back to refine low-level filters and achieves spatial attention. We analyze the features generated at different stages and observe that using refined features leads to better results. We adopt a multi-scale feature strategy that effectively avoids the progressive growth of parameters. Experiments on image recognition datasets including CIFAR-10, CIFAR-100, SVHN, and ImageNet show that our proposed models achieve state-of-the-art performance with fewer parameters.
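To make the alternate-update scheme concrete, below is a minimal sketch of one clique block in PyTorch. The class name CliqueBlock, the per-pair 3x3 convolutions stored in convs, and the channels, num_layers, and num_stages parameters are illustrative assumptions rather than the authors' released implementation; the sketch only shows how a first stage initializes each layer from its predecessors and how later stages re-update every layer from the most recent versions of all other layers while reusing the same weights.

```python
import torch
import torch.nn as nn


class CliqueBlock(nn.Module):
    """Minimal sketch of a CliqueNet-style block with alternately updated layers.

    Hypothetical simplification: every ordered connection i -> j (i != j) has
    its own 3x3 convolution, and these weights are reused in every update stage.
    """

    def __init__(self, num_layers=4, channels=36):
        super().__init__()
        self.num_layers = num_layers
        # One convolution per ordered pair (i, j), i != j; the same weights are
        # reused whenever layer j is (re-)updated from layer i.
        self.convs = nn.ModuleDict({
            f"{i}->{j}": nn.Conv2d(channels, channels, 3, padding=1)
            for i in range(num_layers) for j in range(num_layers) if i != j
        })
        # Convolutions that feed the block input into each layer (Stage I only).
        self.conv_in = nn.ModuleList(
            [nn.Conv2d(channels, channels, 3, padding=1) for _ in range(num_layers)]
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x0, num_stages=2):
        # Stage I: initialize each layer from the block input and the layers
        # already initialized before it (a feed-forward pass).
        layers = []
        for j in range(self.num_layers):
            h = self.conv_in[j](x0)
            for i in range(j):
                h = h + self.convs[f"{i}->{j}"](layers[i])
            layers.append(self.act(h))
        # Later stages: alternately re-update each layer from the most recent
        # versions of all the other layers, reusing the same pairwise weights,
        # so refined (higher-level) features feed back into earlier layers.
        for _ in range(num_stages - 1):
            for j in range(self.num_layers):
                h = sum(self.convs[f"{i}->{j}"](layers[i])
                        for i in range(self.num_layers) if i != j)
                layers[j] = self.act(h)
        # Concatenate the refined layers as the block's output feature.
        return torch.cat(layers, dim=1)


# Usage sketch: a 32x32 input with 36 channels passed through one block.
block = CliqueBlock(num_layers=4, channels=36)
out = block(torch.randn(1, 36, 32, 32))
print(out.shape)  # torch.Size([1, 144, 32, 32])
```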
