
ResNeSt

PyTorch implementation of ResNeSt: Split-Attention Networks [1].

This implementation exists mainly to aid my own understanding of the ResNeSt architecture, in particular the radix-major formulation of the bottleneck block (sketched below).
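As a rough illustration of what the radix-major Split-Attention computation looks like, here is a minimal sketch based on my reading of the paper, not the code in this repository; the module name SplitAttention and its constructor parameters are assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SplitAttention(nn.Module):
    """Split-Attention over `radix` feature-map splits (illustrative sketch).

    Radix-major layout: the input holds radix * channels feature maps,
    grouped by radix first, so splits are recovered with a single view().
    """

    def __init__(self, channels, radix=2, cardinality=1, reduction=4):
        super().__init__()
        self.radix = radix
        self.cardinality = cardinality
        inter_channels = max(channels * radix // reduction, 32)
        # Two grouped 1x1 convs act as the attention MLP.
        self.fc1 = nn.Conv2d(channels, inter_channels, 1, groups=cardinality)
        self.bn1 = nn.BatchNorm2d(inter_channels)
        self.fc2 = nn.Conv2d(inter_channels, channels * radix, 1, groups=cardinality)

    def forward(self, x):
        b, rc, h, w = x.shape
        c = rc // self.radix
        # Radix-major: view as (B, radix, C, H, W) and fuse the splits by summing.
        splits = x.view(b, self.radix, c, h, w)
        gap = F.adaptive_avg_pool2d(splits.sum(dim=1), 1)      # (B, C, 1, 1)
        atten = self.fc2(F.relu(self.bn1(self.fc1(gap))))      # (B, radix*C, 1, 1)
        # r-SoftMax: normalize the attention logits across the radix splits.
        atten = atten.view(b, self.cardinality, self.radix, -1).transpose(1, 2)
        atten = F.softmax(atten, dim=1) if self.radix > 1 else torch.sigmoid(atten)
        atten = atten.reshape(b, self.radix, c, 1, 1)
        # Attention-weighted sum of the radix splits.
        return (splits * atten).sum(dim=1)
```

A quick shape check: with channels=64 and radix=2, an input of shape (2, 128, 56, 56) produces an output of shape (2, 64, 56, 56).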

The official implementation: https://github.com/zhanghang1989/ResNeSt

Requirements

  • docker
  • docker-compose

Model

  • Only supports dilation=1 (a simplified bottleneck sketch follows this list).
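For orientation, a bottleneck block built around the Split-Attention sketch above might be assembled as follows. This is a simplified, hypothetical version with dilation fixed at 1; it omits details of the official design such as average-pool downsampling, and the class and argument names are assumptions rather than this repository's API:

```python
import torch
import torch.nn as nn


class Bottleneck(nn.Module):
    """Sketch of a ResNeSt-style bottleneck (dilation fixed at 1)."""

    def __init__(self, in_channels, channels, stride=1, radix=2, cardinality=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, channels, 1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        # Radix-major 3x3 conv: emits radix * channels maps in one grouped conv.
        self.conv2 = nn.Conv2d(channels, channels * radix, 3, stride=stride,
                               padding=1, groups=cardinality * radix, bias=False)
        self.bn2 = nn.BatchNorm2d(channels * radix)
        self.attn = SplitAttention(channels, radix, cardinality)  # from the sketch above
        self.conv3 = nn.Conv2d(channels, channels * 4, 1, bias=False)
        self.bn3 = nn.BatchNorm2d(channels * 4)
        self.relu = nn.ReLU(inplace=True)
        # Project the identity branch when shape or stride changes.
        self.downsample = None
        if stride != 1 or in_channels != channels * 4:
            self.downsample = nn.Sequential(
                nn.Conv2d(in_channels, channels * 4, 1, stride=stride, bias=False),
                nn.BatchNorm2d(channels * 4))

    def forward(self, x):
        identity = x if self.downsample is None else self.downsample(x)
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.attn(self.relu(self.bn2(self.conv2(out))))
        out = self.bn3(self.conv3(out))
        return self.relu(out + identity)
```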

ToDo

  • Evaluate the model

Reference

[1] ResNeSt: Split-Attention Networks. Hang Zhang, Chongruo Wu, Zhongyue Zhang, Yi Zhu, Zhi Zhang, Haibin Lin, Yue Sun, Tong He, Jonas Mueller, R. Manmatha, Mu Li, Alexander Smola. https://arxiv.org/abs/2004.08955

Author

Sawada Tomoya