[WACV 2024] Separable Self and Mixed Attention Transformers for Efficient Object Tracking
Primary language: Python · License: Apache-2.0