Attention as activation

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Activation functions and attention mechanisms are typically treated as having different purposes and have evolved differently. However, both concepts can be formulated as a non-linear gating function. Inspired by this similarity, we propose a novel type of activation unit called the attentional activation (ATAC) unit, a unification of activation functions and attention mechanisms. In particular, we propose a local channel attention module for simultaneous non-linear activation and element-wise feature refinement, which locally aggregates point-wise cross-channel feature contexts. By replacing the well-known rectified linear units with such ATAC units in convolutional networks, we can construct fully attentional networks that perform significantly better at the cost of a modest number of additional parameters. We conducted detailed ablation studies of the ATAC units using several host networks of varying depth to empirically verify their effectiveness and efficiency. Furthermore, we compared the performance of the ATAC units against existing activation functions as well as other attention mechanisms on the CIFAR-10, CIFAR-100, and ImageNet datasets. Our experimental results show that networks constructed with the proposed ATAC units generally yield performance gains over their competitors given a comparable number of parameters.
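To make the idea concrete, below is a minimal PyTorch sketch of such a gating unit: the input is multiplied element-wise by the sigmoid of a gate computed from its own cross-channel context via point-wise (1x1) convolutions. The exact gate architecture here (a PWConv-BN-ReLU-PWConv-BN bottleneck) and the reduction ratio r are illustrative assumptions, not the paper's verified settings.

```python
import torch
import torch.nn as nn

class ATAC(nn.Module):
    """Attentional activation sketch: y = x * sigmoid(g(x)), where g is a
    local channel attention module built from point-wise convolutions."""
    def __init__(self, channels: int, r: int = 4):
        super().__init__()
        hidden = max(channels // r, 1)  # bottleneck width; r is an assumed hyperparameter
        self.gate = nn.Sequential(
            nn.Conv2d(channels, hidden, kernel_size=1, bias=False),  # point-wise: mixes channels only
            nn.BatchNorm2d(hidden),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Element-wise gating: each spatial position is refined using only its
        # own locally aggregated cross-channel context (no global pooling),
        # so the unit acts as both activation function and attention mechanism.
        return x * torch.sigmoid(self.gate(x))

# Drop-in usage: replace nn.ReLU() in a convolutional block with ATAC(num_channels).
x = torch.randn(2, 64, 32, 32)
y = ATAC(64)(x)
assert y.shape == x.shape
```

Because the gate depends on the input itself, the unit is a non-linear gating function in the sense described above; with the identity gate replaced by a constant, it would degenerate to a plain scaled activation.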

Original language: English
Title: Proceedings of ICPR 2020 - 25th International Conference on Pattern Recognition
Number of pages: 6
Publisher: IEEE
Publication date: 2020
Pages: 4131-4136
Article number: 9413020
ISBN (electronic): 9781728188089
DOI:
Status: Published - 2020
Event: 25th International Conference on Pattern Recognition, ICPR 2020 - Virtual, Milan, Italy
Duration: 10 Jan 2021 - 15 Jan 2021

Conference

Conference: 25th International Conference on Pattern Recognition, ICPR 2020
Country: Italy
City: Virtual, Milan
Period: 10/01/2021 - 15/01/2021
