ResAttNet

Repository of Team Lynx for ResCon 2021


ResCon - Team Lynx

Residual Attention Network for Image Classification

Project Status: Inactive – The project has reached a stable, usable state but is no longer being actively developed; support/maintenance will be provided as time allows.


Interactions between the feature and attention masks of the Residual Attention Network [image referenced from the paper]

Basic Model

We used the PyTorch Lightning framework to implement a Residual Attention Network for image classification. As the paper's abstract describes it, a Residual Attention Network is a convolutional neural network using an attention mechanism that can be incorporated into state-of-the-art feed-forward network architectures.
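The key combination step in this architecture is attention residual learning: the soft mask branch M(x) modulates the trunk-branch features F(x) via H(x) = (1 + M(x)) · F(x). The snippet below is an illustrative NumPy sketch of that formula, not code from this repository; the function name and toy shapes are our own.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def attention_residual(trunk, mask_logits):
    """Attention residual learning from the paper:
    H(x) = (1 + M(x)) * F(x), where the soft mask M(x)
    is squashed into (0, 1) by a sigmoid."""
    mask = sigmoid(mask_logits)
    return (1.0 + mask) * trunk

# Toy feature map: batch of 1, 2 channels, 4x4 spatial grid.
trunk = np.random.randn(1, 2, 4, 4)
mask_logits = np.random.randn(1, 2, 4, 4)
out = attention_residual(trunk, mask_logits)

# The scale factor lies in (1, 2): salient features are amplified,
# and no feature is ever zeroed out (unlike plain H(x) = M(x) * F(x)).
assert out.shape == trunk.shape
```

Because the scale factor never drops below 1, stacking many attention modules cannot degrade the trunk features to zero, which is what lets the network go deep.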

You can run the notebook to test our code.

To train the model, run train_pl.py.

Observations

| Dataset | Architecture (attention type) | Optimiser | Image size | Training loss | Test loss |
|---------|-------------------------------|-----------|------------|---------------|-----------|
| CIFAR-100 | Attention-92 | SGD | 32 | 1.26 | 1.58 |
| CIFAR-10 | Attention-92 | SGD | 32 | 0.51 | 0.53 |
| CIFAR-100 | Attention-56 | SGD | 224 | 1.42 | 1.80 |
| CIFAR-10 | Attention-56 | SGD | 224 | 0.61 | 0.65 |
| CIFAR-100 | Attention-92 | SGD | 224 | 2.95 | 2.90 |
| CIFAR-10 | Attention-92 | SGD | 224 | 1.12 | 1.01 |

Saved checkpoints for some of the models can be downloaded from the linked drive.

Further Improvements

Because of the computational power and time required to train the models on our machines, we were able to implement only a few architectures, and on only two datasets. Areas we are looking to improve on and work on in the future:

  • Implementing Attention-56 Architecture
  • Implementing Attention-92 Architecture
  • Implementing Attention-128, Attention-156 Architecture
  • Implementing the paper using other deep learning frameworks such as TensorFlow

Paper implemented

Residual Attention Network for Image Classification, by Fei Wang, Mengqing Jiang, Chen Qian, Shuo Yang, Cheng Li, Honggang Zhang, Xiaogang Wang and Xiaoou Tang

References

ResidualAttentionNetwork-pytorch (GitHub)
Residual Attention Network (GitHub)

Citations

@inproceedings{wang2017residual,
  title={Residual attention network for image classification},
  author={Wang, Fei and Jiang, Mengqing and Qian, Chen and Yang, Shuo and Li, Cheng and Zhang, Honggang and Wang, Xiaogang and Tang, Xiaoou},
  booktitle={Proceedings of the IEEE conference on computer vision and pattern recognition},
  pages={3156--3164},
  year={2017}
}

Contributors

  • Harshit Aggarwal
  • Kunal Mundada
  • Pranav B Kashyap