Sit2stand.ai -- scripts replicating the data analysis in the paper

Quickstart

Extraction of movement metrics from trajectories (reproducing our paper)

  1. Install Jupyter Notebook and the requirements with
pip install -r requirements.txt
  2. Download the processed videos (trajectories of body landmarks) from Google Drive and unzip them into the videos/np/ directory
  3. Run GetMetrics.ipynb to derive all metrics used for the statistical analysis -- they will be saved in results.csv
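As a rough sketch of what the metrics step works with: each processed video is a numpy array of per-frame landmark coordinates. The file layout, array shape, and keypoint index below are illustrative assumptions, not the notebook's actual conventions.

```python
import numpy as np

def vertical_hip_excursion(trajectory: np.ndarray, hip_idx: int = 8) -> float:
    """Peak-to-peak vertical displacement of one keypoint.

    trajectory: assumed shape (n_frames, n_keypoints, 2) holding (x, y)
    pixel coordinates per frame; hip_idx=8 follows OpenPose's BODY_25
    mid-hip convention (assumption for illustration).
    """
    y = trajectory[:, hip_idx, 1]
    return float(y.max() - y.min())

# Synthetic data standing in for a file like videos/np/<video_id>.npy
rng = np.random.default_rng(0)
traj = rng.random((120, 25, 2))  # 120 frames, 25 keypoints, (x, y)
excursion = vertical_hip_excursion(traj)
```

GetMetrics.ipynb derives many such quantities per video and writes them as rows of results.csv.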

Data analysis

  1. Start an RStudio project in the stats directory
  2. Run the sit2stand_clean-data_v15.Rmd notebook, either with the results.csv file derived in the previous step or with the provided dataClean.csv file.

Optional: Processing videos

If you don't want to use our preprocessed video trajectories, you can process the videos yourself. Note that your results may differ slightly from ours, since we only share deidentified videos while we ran OpenPose on the raw videos.

  1. Download the videos from our Google Drive.
  2. Run OpenPose on the videos, for example as we did here.
  3. Process the videos to get x,y trajectories of keypoints and save them as numpy arrays, as we did here.
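Steps 2 and 3 could be sketched roughly as follows. OpenPose writes one JSON file per frame with a flat pose_keypoints_2d list of (x, y, confidence) triples; the directory layout, single-person assumption, and BODY_25 model below are illustrative and may differ from our actual processing script.

```python
import json
import numpy as np
from pathlib import Path

def openpose_json_to_trajectory(json_dir: str) -> np.ndarray:
    """Stack per-frame OpenPose JSON files into an (n_frames, 25, 2) array.

    Assumes one person per frame and the BODY_25 model, whose
    pose_keypoints_2d field is a flat [x0, y0, c0, x1, y1, c1, ...] list.
    """
    frames = []
    for path in sorted(Path(json_dir).glob("*_keypoints.json")):
        data = json.loads(path.read_text())
        if not data["people"]:  # no detection in this frame
            frames.append(np.full((25, 2), np.nan))
            continue
        kp = np.asarray(data["people"][0]["pose_keypoints_2d"]).reshape(-1, 3)
        frames.append(kp[:, :2])  # keep (x, y), drop the confidence column
    return np.stack(frames)

# Example (hypothetical paths):
# traj = openpose_json_to_trajectory("openpose_json/some_video/")
# np.save("videos/np/some_video.npy", traj)
```

The resulting .npy files are what GetMetrics.ipynb expects in videos/np/.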