KIFNet: Continuous Prediction of Leg Kinematics during Walking using Inertial Sensors, Smart Glasses, and Embedded Computing
This repository contains the training source code used for our paper accepted to ICRA 2023.
To run our code you will need:
- The Egocentric Vision & Kinematics dataset. Download it and update the corresponding data config files to match your local paths.
- The ml-mobileone package and model weights (we used S0 unfused), which you need to place in the `./ml-mobileone/weights/` folder (see the loading sketch after this list).
- An existing Weights & Biases project. We use W&B for experiment tracking and configuration, so it is required to run our pipeline unless you modify the code (see the W&B sketch after this list).
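How the training code consumes the MobileOne backbone is defined in this repo; as a minimal sketch of where the weights are expected, loading the S0 unfused checkpoint with the ml-mobileone package typically looks like the snippet below. The checkpoint filename is an assumption and may differ from the file you downloaded.

```python
import torch
from mobileone import mobileone  # provided by the ml-mobileone package

# Build the S0 variant in training (unfused) mode.
backbone = mobileone(variant="s0", inference_mode=False)

# Filename is an assumption -- use whichever unfused checkpoint you downloaded
# and placed under ./ml-mobileone/weights/.
state_dict = torch.load(
    "./ml-mobileone/weights/mobileone_s0_unfused.pth.tar",
    map_location="cpu",
)
backbone.load_state_dict(state_dict)
```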
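Because the pipeline reads its configuration through W&B, a project must exist in your account before training starts. The project name and config keys below are placeholders for illustration only; substitute the values used in your own setup.

```python
import wandb

# Project name and config keys are placeholders -- replace them with your own.
run = wandb.init(
    project="kifnet",
    config={
        "dataset_root": "/path/to/egocentric-vision-kinematics",  # placeholder path
        "batch_size": 64,
        "learning_rate": 1e-4,
    },
)

# Downstream code can then read hyperparameters from wandb.config.
cfg = wandb.config
print(cfg.batch_size)
```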
Contributors: