
KIFNet: Continuous Prediction of Leg Kinematics during Walking using Inertial Sensors, Smart Glasses, and Embedded Computing

This repository contains the training source code used for our paper accepted to ICRA 2023.

Model architecture

To run our code you will need:

  1. The Egocentric Vision & Kinematics dataset. Download it and update the appropriate data config files to match your paths.
  2. The ml-mobileone package and model weights (we used S0 unfused), which you need to place in the ./ml-mobileone/weights/ folder; see the loading sketch after this list.
  3. An existing Weights & Biases project. We used W&B for experiment tracking and configuration, so it is required to run our pipeline unless you are willing to modify the code; a minimal initialization example is shown below.
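
To check that the backbone weights are in place, something along the lines of the sketch below can be used. It assumes that mobileone.py from the ml-mobileone repository is importable and that the checkpoint keeps a filename like mobileone_s0_unfused.pth.tar (our guess; adjust the path to whatever file you downloaded). This is only an illustration, not the exact loading code used in the training pipeline.

```python
import torch
from mobileone import mobileone  # mobileone.py from the ml-mobileone repository

# Assumed checkpoint location/name based on the folder layout described above.
CKPT = "./ml-mobileone/weights/mobileone_s0_unfused.pth.tar"

# Build the unfused (training-time) S0 variant and load the pretrained weights.
backbone = mobileone(variant="s0", inference_mode=False)
backbone.load_state_dict(torch.load(CKPT, map_location="cpu"))
backbone.eval()

# Sanity check: a single 224x224 RGB frame should pass through the backbone.
with torch.no_grad():
    out = backbone(torch.randn(1, 3, 224, 224))
print(out.shape)
```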

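Since the pipeline relies on W&B for both experiment tracking and configuration, a run has to be initialized before training. The snippet below is a rough illustration only: the project name and config keys are placeholders, not the ones our configs actually use.

```python
import wandb

# Placeholder project name and config keys, shown only to illustrate the W&B setup.
run = wandb.init(
    project="kifnet",  # replace with your own W&B project
    config={"backbone": "mobileone_s0", "batch_size": 64, "learning_rate": 1e-3},
)
cfg = wandb.config  # hyperparameters become available to the training code here

wandb.log({"val_loss": 0.0})  # dummy metric, just to show the tracking call
run.finish()
```
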
Contributors: