/Castling-ViT

[CVPR 2023] Castling-ViT: Compressing Self-Attention via Switching Towards Linear-Angular Attention During Vision Transformer Inference
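For orientation, a minimal sketch of the generic kernelized linear-attention family that the paper's linear-angular attention belongs to is shown below. This is not the repository's implementation: the feature map here is the common ELU+1 kernel, the function name `linear_attention` is hypothetical, and the paper's angular kernel and switching mechanism are not reproduced.

```python
import torch
import torch.nn.functional as F

def linear_attention(q, k, v):
    """Generic kernelized linear attention, O(N*d^2) instead of O(N^2*d).

    q, k, v: tensors of shape (batch, heads, seq_len, head_dim).
    Uses an ELU+1 feature map as a stand-in kernel; Castling-ViT's
    linear-angular kernel differs from this sketch.
    """
    q = F.elu(q) + 1.0  # non-negative feature map phi(q)
    k = F.elu(k) + 1.0  # phi(k)
    # Aggregate keys and values once: sum_n phi(k_n) v_n^T -> (b, h, d, d)
    kv = torch.einsum("bhnd,bhne->bhde", k, v)
    # Per-query normalizer: phi(q_n)^T sum_m phi(k_m)
    z = 1.0 / (torch.einsum("bhnd,bhd->bhn", q, k.sum(dim=2)) + 1e-6)
    # Output: phi(q_n)^T kv, rescaled by the normalizer
    return torch.einsum("bhnd,bhde,bhn->bhne", q, kv, z)
```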

Primary Language: Python · License: Apache-2.0
