arXiv Analytics

arXiv:1811.03436 [cs.LG]

Alpha-Pooling for Convolutional Neural Networks

Hayoung Eom, Heeyoul Choi

Published 2018-11-08 (Version 1)

Convolutional neural networks (CNNs) have achieved remarkable performance in many applications, especially image recognition. As a crucial component of CNNs, sub-sampling plays an important role, and max pooling and arithmetic average pooling are the most commonly used sub-sampling methods. Beyond these two, however, many other pooling types are possible, such as the geometric average, the harmonic average, and so on. Since it is not easy for algorithms to find the best pooling method, human experts choose the type of pooling, which might not be optimal for a given task. Following the deep learning philosophy, the type of pooling can instead be learned from data for the task at hand. In this paper, we propose {\em alpha-pooling}, which has a trainable parameter $\alpha$ that determines the type of pooling. Alpha-pooling is a general pooling method that includes max pooling and arithmetic average pooling as special cases, depending on the parameter $\alpha$. In experiments, alpha-pooling improves the accuracy of image recognition tasks, and we found that max pooling is not the optimal pooling scheme. Moreover, different layers turn out to have different optimal pooling types.
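To make the idea concrete, below is a minimal sketch of such a trainable pooling layer in PyTorch, assuming a generalized-mean (power-mean) formulation, y = (mean(x^alpha))^(1/alpha), which recovers arithmetic average pooling at alpha = 1 and approaches max pooling as alpha grows. The paper's exact parameterization of alpha-pooling may differ; the class name AlphaPool2d and the per-layer scalar alpha are illustrative choices, not the authors' code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AlphaPool2d(nn.Module):
    # Trainable generalized-mean pooling: y = (avg(x^alpha))^(1/alpha).
    # alpha = 1 gives arithmetic average pooling; large alpha approaches max pooling.
    def __init__(self, kernel_size, stride=None, alpha_init=1.0, eps=1e-6):
        super().__init__()
        self.kernel_size = kernel_size
        self.stride = stride if stride is not None else kernel_size
        # One trainable alpha per pooling layer (an assumption of this sketch).
        self.alpha = nn.Parameter(torch.tensor(float(alpha_init)))
        self.eps = eps

    def forward(self, x):
        # Requires non-negative inputs, e.g. post-ReLU feature maps.
        x = x.clamp(min=self.eps)
        # Keep the exponent at least 1 so the layer stays between average and
        # max pooling (a simplification for numerical stability).
        a = self.alpha.clamp(min=1.0)
        pooled = F.avg_pool2d(x.pow(a), self.kernel_size, self.stride)
        return pooled.pow(1.0 / a)

# Usage: drop-in replacement for nn.MaxPool2d / nn.AvgPool2d after a ReLU.
pool = AlphaPool2d(kernel_size=2)
features = torch.relu(torch.randn(8, 16, 32, 32))
out = pool(features)  # shape: (8, 16, 16, 16)

Because alpha is an ordinary parameter, it is updated by backpropagation along with the convolutional weights, so each pooling layer can settle on its own effective pooling type.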

Comments: 5 pages, submitted to ICASSP 2019
Categories: cs.LG, stat.ML
Related articles:
arXiv:1810.13098 [cs.LG] (Published 2018-10-31)
Low-Rank Embedding of Kernels in Convolutional Neural Networks under Random Shuffling
arXiv:1812.04439 [cs.LG] (Published 2018-12-11)
Synergy Effect between Convolutional Neural Networks and the Multiplicity of SMILES for Improvement of Molecular Prediction
arXiv:1905.08094 [cs.LG] (Published 2019-05-17)
Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation