The master branch works with PyTorch 1.3+.
MMAction is an open-source toolbox for action understanding based on PyTorch. It is a part of the OpenMMLab project developed by Multimedia Laboratory, CUHK.
- Modular design: We decompose the action understanding framework into different components, so one can easily construct a customized action understanding framework by combining different modules (see the config sketch after this list).
- Support for various datasets: The toolbox directly supports multiple datasets, including UCF101, Kinetics-400, Something-Something V1 & V2, Moments in Time, Multi-Moments in Time, THUMOS14, etc.
- Support for multiple action understanding frameworks: MMAction implements popular frameworks for action understanding:
  - For action recognition, various algorithms are implemented, including TSN, TSM, R(2+1)D, I3D, SlowOnly, and SlowFast.
  - For temporal action localization, we implement BSN and BMN.
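To illustrate the modular design, below is a minimal, hypothetical config sketch in the OpenMMLab style: a recognizer is assembled from a backbone and a classification head purely through a config dict. The type names and fields shown (Recognizer2D, ResNet, TSNHead, and their arguments) are illustrative assumptions and may not match the actual configs shipped with MMAction.

```python
# A hypothetical MMAction-style config: the recognizer is composed of
# interchangeable modules. Swapping the backbone or head only requires
# editing this dict; the training code stays unchanged.
# (Module names and fields below are illustrative assumptions.)
model = dict(
    type='Recognizer2D',              # recognizer wrapper (assumed name)
    backbone=dict(
        type='ResNet',                # 2D ResNet backbone
        depth=50,
        pretrained='torchvision://resnet50'),
    cls_head=dict(
        type='TSNHead',               # TSN-style classification head
        num_classes=400,              # e.g. Kinetics-400
        in_channels=2048))

# Building a different recognition framework, e.g. TSM, would amount to
# swapping the module types in the same dict rather than rewriting the pipeline.
```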
This project is released under the Apache 2.0 license.
Benchmark results against other repositories are available in benchmark.md.
Results and models are available in the README.md of each method's config directory.
Supported methods for action recognition: TSN, TSM, R(2+1)D, I3D, SlowOnly, SlowFast.
Supported methods for action localization: BSN, BMN.
Please refer to install.md for installation.
Please refer to data_preparation.md for an overview of data preparation.
Please see getting_started.md for the basic usage of MMAction. There are also tutorials for finetuning models, adding new datasets, designing data pipelines, and adding new modules.
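As a rough sketch of what adding a new module involves, the snippet below assumes an OpenMMLab-style registry exposed as mmaction.models.BACKBONES; the registry name, decorator, and import path are assumptions based on other OpenMMLab projects, so consult the tutorial for the actual API.

```python
# Sketch of registering a custom backbone so it can be referenced from a
# config by name. The import path and registry (mmaction.models.BACKBONES)
# are assumptions in the spirit of other OpenMMLab toolboxes.
import torch.nn as nn

from mmaction.models import BACKBONES  # assumed registry location


@BACKBONES.register_module()
class TinyBackbone(nn.Module):
    """A toy backbone used only to illustrate the registration flow."""

    def __init__(self, in_channels=3, out_channels=64):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, 3, padding=1)

    def forward(self, x):
        return self.conv(x)


# Once registered, the new module can be selected from a config, e.g.:
# backbone=dict(type='TinyBackbone', in_channels=3, out_channels=64)
```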
We appreciate all contributions to improve MMAction. Please refer to CONTRIBUTING.md for the contributing guideline.
MMAction is an open source project contributed to by researchers and engineers from various colleges and companies. We appreciate all the contributors who implement their methods or add new features, as well as users who give valuable feedback. We hope the toolbox and benchmark serve the growing research community by providing a flexible toolkit to reimplement existing methods and develop new models of their own.