
SEM: Switchable Excitation Module for Self-attention Mechanism

This repository is the implementation of "SEM: Switchable Excitation Module for Self-attention Mechanism" [paper] on the CIFAR-10 and CIFAR-100 datasets.

Introduction

SEM is a self-attention module that can automatically select and integrate attention operators to compute attention maps.
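As an illustration only (not the repository's actual code), a switchable excitation block can be sketched as a squeeze step followed by several candidate excitation operators whose outputs are mixed by a learned "switch" branch. The operator choices, class name, and reduction ratio below are all assumptions:

```python
import torch
import torch.nn as nn

class SEMSketch(nn.Module):
    """Hypothetical sketch of a switchable excitation module.

    Candidate excitation operators each map the globally pooled channel
    descriptor to a channel-attention vector; a small switch branch
    predicts per-operator mixing weights before the sigmoid gate.
    """

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Candidate excitation operators (choices here are assumptions).
        self.ops = nn.ModuleList([
            nn.Sequential(
                nn.Linear(channels, max(channels // reduction, 1)),
                nn.ReLU(inplace=True),
                nn.Linear(max(channels // reduction, 1), channels),
            ),
            nn.Linear(channels, channels),  # single linear map
            nn.Identity(),                  # pass-through
        ])
        # Switch branch: one mixing weight per candidate operator.
        self.switch = nn.Sequential(
            nn.Linear(channels, len(self.ops)),
            nn.Softmax(dim=1),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        s = self.pool(x).view(b, c)                  # squeeze: (B, C)
        w = self.switch(s)                           # (B, num_ops)
        outs = torch.stack([op(s) for op in self.ops], dim=1)   # (B, num_ops, C)
        a = torch.sigmoid((w.unsqueeze(-1) * outs).sum(dim=1))  # mixed attention
        return x * a.view(b, c, 1, 1)                # excite: rescale channels
```

Such a block would be dropped into a residual bottleneck the same way an SE block is, rescaling the channel dimension of the residual branch.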

Requirements

Python and PyTorch.

pip install -r requirements.txt

Usage

python run.py --dataset cifar100 --block-name bottleneck --depth 164 --epochs 164 --schedule 81 122 --gamma 0.1 --wd 1e-4
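The --schedule and --gamma flags describe a step learning-rate decay: the learning rate is multiplied by gamma at each listed milestone epoch. A minimal sketch of that policy (a hypothetical stand-in; run.py's actual option handling may differ):

```python
def lr_at_epoch(epoch, base_lr=0.1, schedule=(81, 122), gamma=0.1):
    """Learning rate used at a given epoch under step decay.

    Mirrors the --schedule 81 122 --gamma 0.1 flags: the base rate is
    multiplied by gamma once for every milestone already passed.
    (base_lr=0.1 is an assumed default, not taken from run.py.)
    """
    lr = base_lr
    for milestone in schedule:
        if epoch >= milestone:
            lr *= gamma
    return lr
```

For example, with the flags above the rate stays at its base value for epochs 0-80, drops by 10x at epoch 81, and by another 10x at epoch 122.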

Results

Model      Dataset     Original  SEM
ResNet164  CIFAR-10    93.39     94.95
ResNet164  CIFAR-100   74.30     76.76

Citing SEM

@article{zhong2022switchable,
  title={Switchable Self-attention Module},
  author={Zhong, Shanshan and Wen, Wushao and Qin, Jinghui},
  journal={arXiv preprint arXiv:2209.05680},
  year={2022}
}

Acknowledgments

Many thanks to bearpaw for his simple and clean PyTorch framework for image classification tasks.