
PSTL

This is an official PyTorch implementation of "Self-supervised Action Representation Learning from Partial Spatio-Temporal Skeleton Sequences" in AAAI2023 (Oral).

[Paper]


Framework

Overview figure of the PSTL framework.

Requirements

python == 3.7
torch == 1.11.0+cu113

Installation

# Install the python libraries
$ cd PSTL
$ pip install -r requirements.txt

Data Preparation

We follow the same dataset processing as AimCLR.
You can also download the processed data folder from the BaiduYun link:

Extraction code: pstl
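
After preprocessing (or downloading the processed folder), the skeleton data are typically stored as a position .npy array plus a pickled label list in the AimCLR-style layout. The snippet below is a minimal loading sketch; the paths and file names are assumptions and should be adapted to your local setup.

# Minimal sketch for inspecting the preprocessed data.
# The paths below are assumptions (AimCLR-style layout); adjust to your setup.
import pickle
import numpy as np

data_path = "data/ntu60/xsub/train_position.npy"    # assumed path
label_path = "data/ntu60/xsub/train_label.pkl"      # assumed path

data = np.load(data_path, mmap_mode="r")            # (N, C, T, V, M) skeleton array
with open(label_path, "rb") as f:
    sample_names, labels = pickle.load(f)

print(data.shape, len(labels))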

Training & Testing

Example of unsupervised pre-training on the NTU-60 xsub dataset.
You can change the settings in config.py.

# pre-training
$ python procedure.py with 'train_mode="pretrain"'

# linear evaluation
$ python procedure.py with 'train_mode="lp"'

Reference

If you find our paper and repo useful, please cite our paper. Thanks!

@article{zhou2023self,
  title={Self-supervised Action Representation Learning from Partial Spatio-Temporal Skeleton Sequences},
  author={Zhou, Yujie and Duan, Haodong and Rao, Anyi and Su, Bing and Wang, Jiaqi},
  journal={arXiv preprint arXiv:2302.09018},
  year={2023}
}

Acknowledgement

  • The framework of our code is based on MS2L.
  • The encoder is based on ST-GCN.

License

This project is licensed under the terms of the MIT license.

Contact

For any questions, feel free to contact: yujiezhou@ruc.edu.cn