A library of functions for human pose estimation with event-driven cameras
Please cite:
@InProceedings{Goyal_2023_CVPR,
author = {Goyal, Gaurvi and Di Pietro, Franco and Carissimi, Nicolo and Glover, Arren and Bartolozzi, Chiara},
title = {MoveEnet: Online High-Frequency Human Pose Estimation With an Event Camera},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2023},
pages = {4023-4032}
}
also for the eH3.6m dataset:
https://zenodo.org/record/7842598
and check out our other publications.
Please contribute your event-driven HPE applications and datasets to enable comparisons!
Compile and link the core C++ library in your application to use the event-based human pose estimation functions, including:
- joint detectors: OpenPose applied to greyscale images formed from events
- joint velocity estimation at > 500 Hz
- asynchronous pose fusion of joint velocities and joint detections (see the sketch after this list)
- event representation methods compatible with convolutional neural networks
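The asynchronous fusion idea can be illustrated independently of the library: high-rate joint velocity estimates are integrated between low-rate detections, and each new detection corrects the accumulated drift. Below is a minimal Python sketch of that scheme, not the library's C++ API; the class, update rates, and blending weight are illustrative assumptions.

```python
import numpy as np

class FusedJoint:
    """Toy asynchronous fusion: integrate high-rate joint velocity,
    correct the estimate whenever a (slower) detection arrives."""

    def __init__(self, position, alpha=0.5):
        self.position = np.asarray(position, dtype=float)  # (x, y) in pixels
        self.alpha = alpha      # blending weight for new detections (assumed value)
        self.last_t = 0.0       # time of the last update, in seconds

    def on_velocity(self, velocity, t):
        # High-rate path (e.g. > 500 Hz): dead-reckon the joint position.
        dt = t - self.last_t
        self.position = self.position + np.asarray(velocity, dtype=float) * dt
        self.last_t = t

    def on_detection(self, detection, t):
        # Low-rate path: blend in the detected position to correct drift.
        self.on_velocity((0.0, 0.0), t)  # bring the state up to the detection time
        self.position = ((1.0 - self.alpha) * self.position
                         + self.alpha * np.asarray(detection, dtype=float))

# Velocity updates at 1 kHz for 0.1 s, then one detection.
joint = FusedJoint(position=(120.0, 80.0))
for k in range(1, 101):
    joint.on_velocity(velocity=(50.0, 0.0), t=k * 1e-3)  # moving right at 50 px/s
joint.on_detection(detection=(126.0, 80.0), t=0.1)
print(joint.position)  # approximately [125.5  80. ]
```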
Importable Python libraries for joint detection:
- event-based MoveNet: MoveEnet, built on PyTorch
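A hedged sketch of how a PyTorch joint detector of this kind is typically driven: events are accumulated into an image-like tensor and passed through the network. The frame size, joint count, and stand-in model below are assumptions, not the repository's actual API or checkpoint-loading code.

```python
import numpy as np
import torch

def events_to_frame(events, height=192, width=192):
    """Accumulate (t, x, y, polarity) events into a single-channel count image."""
    frame = np.zeros((height, width), dtype=np.float32)
    for _, x, y, _ in events:
        frame[int(y), int(x)] += 1.0
    if frame.max() > 0:
        frame /= frame.max()                      # normalise to [0, 1]
    return torch.from_numpy(frame)[None, None]    # (batch, channel, H, W) for a CNN

# Stand-in network so the snippet runs; in practice this would be the trained
# MoveEnet model loaded from a checkpoint (loading code depends on the repository).
model = torch.nn.Conv2d(1, 13, kernel_size=3, padding=1)

events = [(0.001, 40, 60, 1), (0.002, 41, 60, 0), (0.003, 42, 61, 1)]
with torch.no_grad():
    heatmaps = model(events_to_frame(events))     # e.g. one heatmap per skeleton joint
print(heatmaps.shape)                             # torch.Size([1, 13, 192, 192])
```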
Example applications are available to show how to use the HPE-core libraries.
Python scripts can be used to compare different combinations of detectors and velocity estimators.
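One standard metric for such comparisons is the Percentage of Correct Keypoints (PCK): the fraction of predicted joints that fall within a pixel threshold of the ground truth. The self-contained sketch below illustrates the idea; the repository's actual comparison scripts may use different metrics and file formats.

```python
import numpy as np

def pck(predicted, ground_truth, threshold):
    """Percentage of Correct Keypoints: fraction of joints predicted within
    `threshold` pixels of the ground truth.
    Both arrays have shape (num_frames, num_joints, 2)."""
    errors = np.linalg.norm(predicted - ground_truth, axis=-1)
    return float((errors <= threshold).mean())

# Toy comparison of two detectors' outputs against the same ground truth.
rng = np.random.default_rng(0)
gt = rng.uniform(0, 256, size=(100, 13, 2))
detector_a = gt + rng.normal(0.0, 2.0, size=gt.shape)   # low-noise predictions
detector_b = gt + rng.normal(0.0, 8.0, size=gt.shape)   # noisier predictions
for name, pred in (("detector_a", detector_a), ("detector_b", detector_b)):
    print(name, pck(pred, gt, threshold=10.0))
```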
Scripts are provided to convert datasets into common formats, making valid comparisons straightforward.
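As an illustration of the kind of conversion involved (the repository's actual common format is not reproduced here), the sketch below writes events as one `<timestamp> <x> <y> <polarity>` line per event.

```python
import numpy as np

def export_events(events, path):
    """Write events as plain-text lines: <timestamp_seconds> <x> <y> <polarity>.
    `events` is an (N, 4) array of (t, x, y, p) rows."""
    with open(path, "w") as f:
        for t, x, y, p in np.asarray(events, dtype=float):
            f.write(f"{t:.9f} {int(x)} {int(y)} {int(p)}\n")

# Toy example: three events from a 640x480 sensor.
events = np.array([
    [0.000012, 321, 240, 1],
    [0.000034, 322, 240, 0],
    [0.000051, 322, 241, 1],
])
export_events(events, "events.txt")
```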
Event-driven Perception for Robotics
@InProceedings{9845526,
author = {Carissimi, Nicolò and Goyal, Gaurvi and Di Pietro, Franco and Bartolozzi, Chiara and Glover, Arren},
title = {Unlocking Static Images for Training Event-driven Neural Networks},
booktitle = {2022 8th International Conference on Event-Based Control, Communication, and Signal Processing (EBCCSP)},
year = {2022},
pages = {1-4},
doi = {10.1109/EBCCSP56922.2022.9845526}
}