Universal pooling – A new pooling method for convolutional neural networks

Junhyuk Hyun, Hongje Seong, Euntai Kim

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)


Pooling is one of the key elements in a convolutional neural network. It reduces the feature map size, thereby enabling training with a limited amount of computation. The most common pooling methods are average pooling, max pooling, and stride pooling. These common methods, however, can perform only specified, fixed pooling functions and thus have limited expressive power. In this paper, we propose a new pooling method named universal pooling (UP). UP performs different pooling functions depending on the training samples. UP is a general pooling method and includes the previous common pooling methods as special cases. The structure of UP is inspired by attention methods; UP can in fact be regarded as a channel-wise local spatial attention module, which is quite different from attention-based feature reduction methods. We insert UP into a couple of popular networks and apply them to benchmark sets in two applications, namely image recognition and semantic segmentation. The experimental results show that complex poolings are learned by the proposed UP and that UP achieves better performance than the previous pooling methods.
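The idea that an attention-weighted pooling generalizes the common poolings can be illustrated with a minimal sketch. The code below is a hypothetical NumPy illustration, not the paper's actual architecture: for each channel, a softmax over the values in every k × k window produces attention weights, and the pooled output is the weighted sum. The per-channel scale `w` is an assumed stand-in for the learned attention parameters; with `w = 0` the module reduces to average pooling, and as `w` grows it approaches max pooling, showing how fixed poolings arise as special cases.

```python
import numpy as np

def universal_pool(x, w, k=2):
    """Attention-weighted pooling over non-overlapping k x k windows.

    x: feature map of shape (C, H, W); w: per-channel logit scale
    (a hypothetical stand-in for learned attention parameters).
    """
    C, H, W = x.shape
    out = np.empty((C, H // k, W // k))
    for c in range(C):
        for i in range(0, H - k + 1, k):
            for j in range(0, W - k + 1, k):
                win = x[c, i:i + k, j:j + k].ravel()
                # Channel-wise local spatial attention: softmax over the
                # window values, scaled by the per-channel parameter.
                logits = w[c] * win
                a = np.exp(logits - logits.max())
                a /= a.sum()
                out[c, i // k, j // k] = (a * win).sum()
    return out

x = np.arange(16, dtype=float).reshape(1, 4, 4)
avg = universal_pool(x, np.zeros(1))        # w = 0 -> average pooling
mx = universal_pool(x, np.full(1, 50.0))    # large w -> ~max pooling
```

In the paper the pooling function is trained rather than set by a single scalar, but the sketch captures the structural point: one module interpolates between (and beyond) the fixed poolings.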

Original language: English
Article number: 115084
Journal: Expert Systems with Applications
Publication status: Published - 2021 Oct 15

Bibliographical note

Funding Information:
This research was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) ( NRF-2019R1A2C1007153 ).

Publisher Copyright:
© 2021 Elsevier Ltd

All Science Journal Classification (ASJC) codes

  • Engineering (all)
  • Computer Science Applications
  • Artificial Intelligence
