This repo provides efficient AI backbones developed by Huawei Noah's Ark Lab, including GhostNet, TinyNet, and TNT (Transformer in Transformer).
News
2021/06/15 The code of TNT (Transformer in Transformer) has been released in this repo.
2020/11/10 The code of TinyNet (NeurIPS 2020) has been released at MindSpore Model Zoo.
2020/10/31 GhostNet+TinyNet achieves better performance. See details in our NeurIPS 2020 paper: arXiv.
2020/09/24 We release GhostNet models for more vision tasks on MindSpore Hub and MindSpore Model Zoo.
2020/06/10 GhostNet is included in PyTorch Hub.
For GhostNet, this repo provides pretrained models and inference code for TensorFlow and PyTorch:
- TensorFlow: ./ghostnet_tensorflow, with pretrained models.
- PyTorch: ./ghostnet_pytorch, with pretrained models.
- We also open-source the code on MindSpore Hub and MindSpore Model Zoo.
For training, please refer to tinynet or timm.
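The core idea of GhostNet is that an ordinary convolution produces a small set of "intrinsic" feature maps, and cheap depthwise convolutions then generate the remaining "ghost" features, which are concatenated to form the full output. The following is an illustrative PyTorch sketch of such a Ghost module (class and argument names are ours, not necessarily the repo's exact implementation):

```python
import math

import torch
import torch.nn as nn


class GhostModule(nn.Module):
    """Sketch of a Ghost module: a primary conv produces intrinsic
    features; a cheap depthwise conv derives ghost features from them."""

    def __init__(self, inp, oup, kernel_size=1, ratio=2, dw_size=3):
        super().__init__()
        self.oup = oup
        init_channels = math.ceil(oup / ratio)          # intrinsic features
        new_channels = init_channels * (ratio - 1)      # ghost features

        # Ordinary convolution: relatively expensive, few output channels.
        self.primary_conv = nn.Sequential(
            nn.Conv2d(inp, init_channels, kernel_size,
                      padding=kernel_size // 2, bias=False),
            nn.BatchNorm2d(init_channels),
            nn.ReLU(inplace=True),
        )
        # Cheap operation: depthwise conv generating ghost feature maps.
        self.cheap_operation = nn.Sequential(
            nn.Conv2d(init_channels, new_channels, dw_size,
                      padding=dw_size // 2, groups=init_channels, bias=False),
            nn.BatchNorm2d(new_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        x1 = self.primary_conv(x)
        x2 = self.cheap_operation(x1)
        out = torch.cat([x1, x2], dim=1)
        return out[:, :self.oup]  # trim in case of rounding
```

Because the depthwise "cheap operation" costs far fewer FLOPs than a full convolution of the same output width, the module approximates a standard conv layer at a fraction of the compute.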
For TinyNet, this repo provides pretrained models and inference code for PyTorch:
- PyTorch: ./tinynet_pytorch with pretrained model.
- We also open-source the training code on MindSpore Model Zoo.
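TinyNet shrinks a baseline network by jointly twisting the resolution (r), depth (d), and width (w) multipliers. Since the FLOPs of such a scaled network grow roughly as d·w²·r² relative to the baseline, fixing a FLOPs budget c determines one multiplier from the other two. A minimal sketch of that arithmetic (the helper names are ours, for illustration only):

```python
def flops_ratio(d: float, w: float, r: float) -> float:
    """Approximate FLOPs of a scaled model relative to the baseline,
    assuming FLOPs scale as depth * width^2 * resolution^2."""
    return d * (w ** 2) * (r ** 2)


def depth_for_budget(c: float, w: float, r: float) -> float:
    """Solve d from c = d * w^2 * r^2 for a target FLOPs ratio c."""
    return c / ((w ** 2) * (r ** 2))
```

For example, to build a model at half the baseline FLOPs with width multiplier 0.9 and resolution multiplier 0.85, the depth multiplier follows directly from the budget equation.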
This repo provides training code of TNT (Transformer in Transformer) for PyTorch:
- PyTorch: ./tnt_pytorch.
- We also open-source the code on MindSpore Model Zoo.
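In TNT, an inner transformer models relations among pixel-level tokens inside each patch, and the flattened inner tokens are projected and added to the corresponding patch-level token before an outer transformer runs across patches. The block below is a simplified PyTorch sketch of that structure, built from standard `nn.TransformerEncoderLayer` modules rather than the repo's actual layers (class name and dimensions are illustrative assumptions):

```python
import torch
import torch.nn as nn


class TNTBlock(nn.Module):
    """Sketch of one Transformer-in-Transformer block: an inner
    transformer over pixel tokens within each patch, fused into the
    patch tokens processed by an outer transformer."""

    def __init__(self, inner_dim=24, outer_dim=192, num_pixels=16,
                 inner_heads=4, outer_heads=6):
        super().__init__()
        self.inner = nn.TransformerEncoderLayer(
            d_model=inner_dim, nhead=inner_heads,
            dim_feedforward=inner_dim * 4, batch_first=True)
        # Project the flattened pixel tokens of a patch into outer_dim.
        self.proj = nn.Linear(inner_dim * num_pixels, outer_dim)
        self.outer = nn.TransformerEncoderLayer(
            d_model=outer_dim, nhead=outer_heads,
            dim_feedforward=outer_dim * 4, batch_first=True)

    def forward(self, pixel_tokens, patch_tokens):
        # pixel_tokens: (B * num_patches, num_pixels, inner_dim)
        # patch_tokens: (B, num_patches, outer_dim)
        B, N, _ = patch_tokens.shape
        pixel_tokens = self.inner(pixel_tokens)
        fused = self.proj(pixel_tokens.reshape(B, N, -1))
        patch_tokens = self.outer(patch_tokens + fused)
        return pixel_tokens, patch_tokens
```

The inner transformer captures fine-grained structure within a patch that a plain ViT discards when it flattens each patch into a single token.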
@inproceedings{ghostnet,
title={GhostNet: More Features from Cheap Operations},
author={Han, Kai and Wang, Yunhe and Tian, Qi and Guo, Jianyuan and Xu, Chunjing and Xu, Chang},
booktitle={CVPR},
year={2020}
}
@inproceedings{tinynet,
title={Model Rubik's Cube: Twisting Resolution, Depth and Width for TinyNets},
author={Han, Kai and Wang, Yunhe and Zhang, Qiulin and Zhang, Wei and Xu, Chunjing and Zhang, Tong},
booktitle={NeurIPS},
year={2020}
}
@article{tnt,
title={Transformer in transformer},
author={Han, Kai and Xiao, An and Wu, Enhua and Guo, Jianyuan and Xu, Chunjing and Wang, Yunhe},
journal={arXiv preprint arXiv:2103.00112},
year={2021}
}
This repo provides the TensorFlow/PyTorch code of GhostNet. Other versions and applications can be found above (PyTorch Hub, MindSpore Hub, and MindSpore Model Zoo).