OpenGait

A flexible and extensible framework for gait recognition. With OpenGait, you can focus on designing your own models and easily compare them against the state of the art.


📣📣📣 SUSTech1K released! Please check the tutorial. 📣📣📣

🎉🎉🎉 OpenGait has been accepted by CVPR 2023 as a highlight paper! 🎉🎉🎉

OpenGait is a flexible and extensible gait recognition project provided by the Shiqi Yu Group and supported in part by WATRIX.AI.

What's New

Our Publications

  • [TPAMI 2023] Learning Gait Representation from Massive Unlabelled Walking Videos: A Benchmark, Paper, Dataset and Code (coming soon).
  • [CVPR 2023] LidarGait: Benchmarking 3D Gait Recognition with Point Clouds, Paper, Dataset and Code.
  • [CVPR 2023 Highlight] OpenGait: Revisiting Gait Recognition Toward Better Practicality, Paper, Code.
  • [ECCV 2022] GaitEdge: Beyond Plain End-to-end Gait Recognition for Better Practicality, Paper, Code.

A Real Gait Recognition System: All-in-One-Gait

The All-in-One-Gait workflow comprises pedestrian tracking, segmentation, and recognition. See here for details.
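The three stages can be pictured as one sequential pipeline. Below is a minimal Python sketch of that flow; the function names and input/output types are hypothetical placeholders for illustration, not the actual All-in-One-Gait API.

import numpy as np

def track_pedestrians(frames):
    # Hypothetical stage 1: detect and track pedestrians across frames,
    # returning one cropped frame sequence per person ID.
    return {0: frames}  # placeholder: one track covering the whole clip

def segment_silhouettes(person_frames):
    # Hypothetical stage 2: separate each person from the background to get
    # binary silhouette masks (faked here by simple thresholding).
    return [(f.mean(axis=-1) > 128).astype(np.float32) for f in person_frames]

def extract_gait_embedding(silhouettes):
    # Hypothetical stage 3: a gait model maps the silhouette sequence to a
    # fixed-length embedding that can be matched against a gallery.
    avg = np.stack(silhouettes).mean(axis=0)  # average silhouette over time
    return avg.flatten()[:64]                 # placeholder 64-dim embedding

def all_in_one_gait(frames):
    # The overall workflow: tracking -> segmentation -> recognition.
    embeddings = {}
    for pid, person_frames in track_pedestrians(frames).items():
        embeddings[pid] = extract_gait_embedding(segment_silhouettes(person_frames))
    return embeddings

# Toy usage: 30 random 128x128 RGB frames standing in for a video clip.
video = [np.random.randint(0, 256, (128, 128, 3), dtype=np.uint8) for _ in range(30)]
print(all_in_one_gait(video)[0].shape)  # (64,)

In the real system each placeholder stage would be a trained component: a pedestrian detector/tracker, a segmentation network, and a gait recognition model such as those in the Model Zoo.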

Highlighted features

Getting Started

Please see 0.get_started.md. We also provide additional tutorials for your reference.
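To give a concrete feel for what designing your own model involves, here is a minimal, self-contained PyTorch sketch of a silhouette-based gait model: per-frame CNN features aggregated over time by max pooling into a fixed-length embedding. This is an illustrative toy written against plain PyTorch, not OpenGait's actual model interface; the tutorials describe the real API.

import torch
import torch.nn as nn

class ToyGaitNet(nn.Module):
    # Illustrative silhouette-based model: per-frame CNN features are
    # aggregated over the sequence with max pooling into one embedding.
    def __init__(self, embedding_dim=256):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, embedding_dim)

    def forward(self, silhouettes):
        # silhouettes: [batch, time, height, width] binary masks.
        b, t, h, w = silhouettes.shape
        x = silhouettes.reshape(b * t, 1, h, w)
        feats = self.backbone(x).flatten(1).reshape(b, t, -1)
        # Temporal max pooling: keep the strongest response per channel
        # over the whole sequence.
        seq_feat, _ = feats.max(dim=1)
        return self.head(seq_feat)

# Toy usage: a batch of 2 sequences, 30 frames each, 64x44 silhouettes.
model = ToyGaitNet()
emb = model(torch.rand(2, 30, 64, 44))
print(emb.shape)  # torch.Size([2, 256])

Temporal max pooling is a common set-style aggregation in appearance-based gait recognition because it is invariant to frame order and sequence length, which keeps the sketch simple while still producing one embedding per walking sequence.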

Model Zoo

Hugging Face Models

Results of appearance-based gait recognition are available here.

Results of pose-based gait recognition are available here.

Authors

Open Gait Team (OGT)

Acknowledgement

Citation

@InProceedings{Fan_2023_CVPR,
    author    = {Fan, Chao and Liang, Junhao and Shen, Chuanfu and Hou, Saihui and Huang, Yongzhen and Yu, Shiqi},
    title     = {OpenGait: Revisiting Gait Recognition Towards Better Practicality},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {9707-9716}
}

Note: This code is for academic purposes only and may not be used for anything that might be considered commercial use.