Auto Learning Attention

Benteng Ma

Neural Information Processing Systems

Attention modules have been demonstrated to be effective in strengthening the representation ability of a neural network by reweighting spatial or channel features, or by stacking both operations sequentially. However, designing the structures of different attention operations requires significant computation and extensive expertise.
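For context, the spatial and channel reweighting mentioned in the abstract can be illustrated with SE-style channel attention and CBAM-style spatial attention. The sketch below is a generic PyTorch illustration of those two hand-designed operations (and of stacking them sequentially), not the AutoLA/HOGA module itself; the class names, reduction ratio, and kernel size are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """SE-style channel reweighting: squeeze spatial dims, excite per-channel weights."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(x.mean(dim=(2, 3)))      # global average pool -> (b, c) channel weights
        return x * w.view(b, c, 1, 1)        # reweight each channel

class SpatialAttention(nn.Module):
    """CBAM-style spatial reweighting: pool over channels, produce a per-pixel mask."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)     # (b, 1, h, w) average over channels
        mx, _ = x.max(dim=1, keepdim=True)    # (b, 1, h, w) max over channels
        mask = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * mask                       # reweight each spatial location

# Stacking both operations sequentially (channel first, then spatial), CBAM-like.
feat = torch.randn(2, 64, 32, 32)
out = SpatialAttention()(ChannelAttention(64)(feat))
```

Designing and tuning such structures by hand is what the paper argues is costly; AutoLA instead searches the attention configuration automatically.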


Fair comparison and ablation study

Neural Information Processing Systems

The results on CIFAR10 are listed in Table R1. They reveal that the HOGA searched by AutoLA (k=4) still outperforms SE and CBAM by a large margin. We further customized SE and CBAM with the group split operation (denoted by "HOG"), resulting in group-split variants of both modules. The HOGA searched by AutoLA also outperforms its randomly searched counterparts (denoted by "Rand"). We tested the generalization ability of the HOGA searched on ResNet56 (denoted by "AutoLA_56") by transferring it to WiderResNet; the results indicate the consistent superiority of the HOGA searched by AutoLA over previous attention methods. We also compared AutoLA with SE and CBAM on a larger backbone. The results in Table R3 suggest that AutoLA still outperforms other attention modules.