
TD-GCN-Gesture

This is the official repository of TD-GCN. Our work has been accepted by IEEE Transactions on Multimedia (TMM).
Jinfu Liu, Xinshun Wang, Can Wang, Yuan Gao, Mengyuan Liu. Temporal Decoupling Graph Convolutional Network for Skeleton-based Gesture Recognition. IEEE Transactions on Multimedia (TMM), 2024.

Prerequisites

You can install all dependencies by running pip install -r requirements.txt
Then, you need to install torchlight by running pip install -e torchlight
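
For example, from the repository root:

```bash
# Install the Python dependencies
pip install -r requirements.txt
# Install the bundled torchlight package in editable mode
pip install -e torchlight
```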

Data Preparation

Download the four datasets:

  1. SHREC’17 Track dataset from http://www-rech.telecom-lille.fr/shrec2017-hand/
  2. DHG-14/28 dataset from http://www-rech.telecom-lille.fr/DHGdataset/
  3. NTU RGB+D 60 Skeleton dataset from https://rose1.ntu.edu.sg/dataset/actionRecognition/
  4. NW-UCLA dataset (via the Download NW-UCLA dataset link)

Then put the downloaded data into the following directory structure:
- data/
  - shrec/
    - shrec17_dataset/
      - HandGestureDataset_SHREC2017/
        - gesture_1
          ...
  - DHG14-28/
    - DHG14-28_dataset/
      - gesture_1
        ...
  - NW-UCLA/
    - all_sqe
      ...
  - ntu/
    - nturgbd_raw/
      - nturgb+d_skeletons
        ...

SHREC’17 Track dataset:

  1. First, extract all files to ./data/shrec/shrec17_dataset
  2. Then, run python gen_traindataset.py and python gen_testdataset.py (see the sketch below)
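
A minimal sketch of the full preparation; both commands are taken from the steps above and are assumed to be run from the directory containing the generation scripts:

```bash
# After extracting SHREC'17 into ./data/shrec/shrec17_dataset,
# generate the training and testing splits
python gen_traindataset.py
python gen_testdataset.py
```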

DHG-14/28 dataset:

  1. First, extract all files to ./data/DHG14-28/DHG14-28_dataset
  2. Then, run python gen_dhgdataset.py (see the sketch below)
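
As with SHREC’17, a sketch assuming the command is run from the directory containing the script:

```bash
# Generate the DHG-14/28 data from the extracted files
python gen_dhgdataset.py
```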

NTU RGB+D 60 dataset:

  1. First, extract all skeleton files to ./data/ntu/nturgbd_raw
  2. Then, run python get_raw_skes_data.py, python get_raw_denoised_data.py, and python seq_transformation.py in sequence (see below)
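
The three stages can be chained as below; the per-stage comments are inferred from the script names:

```bash
# Stage 1: gather the raw skeleton files
python get_raw_skes_data.py
# Stage 2: denoise the raw data
python get_raw_denoised_data.py
# Stage 3: transform sequences into the final training format
python seq_transformation.py
```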

NW-UCLA dataset:

  1. Move the all_sqe folder to ./data/NW-UCLA (see the sketch below)
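
A minimal sketch, assuming the extracted all_sqe folder sits in the current directory:

```bash
# Move the extracted skeleton folder into the expected location
mv all_sqe ./data/NW-UCLA/
```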

Training

You can change the configuration in the yaml files and in the main function. We also provide four default yaml configuration files under ./config/.
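
As an illustration, a training yaml typically exposes fields like the following; the keys and values here are a hypothetical sketch, so consult the provided files under ./config/ for the actual options:

```yaml
# Hypothetical excerpt of a training config; the real keys may differ.
work_dir: ./work_dir/shrec17   # where logs and checkpoints are written
device: [0]                    # GPU id(s), can be overridden with --device
batch_size: 64
num_epoch: 150
base_lr: 0.1
```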

SHREC’17 Track dataset:

Run python main.py --device 0 --config ./config/shrec17/shrec17.yaml

DHG-14/28 dataset:

Run python main.py --device 0 --config ./config/dhg14-28/DHG14-28.yaml

NTU RGB+D 60 dataset:

For the cross-view benchmark, run python main.py --device 0 --config ./config/nturgbd-cross-view/default.yaml
For the cross-subject benchmark, run python main.py --device 0 --config ./config/nturgbd-cross-subject/default.yaml
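
For convenience, both commands collected in one block:

```bash
# Cross-view benchmark
python main.py --device 0 --config ./config/nturgbd-cross-view/default.yaml
# Cross-subject benchmark
python main.py --device 0 --config ./config/nturgbd-cross-subject/default.yaml
```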

NW-UCLA dataset:

Run python main.py --device 0 --config ./config/ucla/nw-ucla.yaml

Testing

We provide several trained weight files in the checkpoints folder.

python main.py --device 0 --config <config.yaml> --phase test --weights <work_dir>/<weight.pt>
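
For example, to evaluate a SHREC’17 checkpoint (the weight filename is a placeholder; substitute an actual file from the checkpoints folder):

```bash
# Evaluate a trained model on SHREC'17; <weight.pt> is a placeholder
python main.py --device 0 --config ./config/shrec17/shrec17.yaml \
  --phase test --weights ./checkpoints/<weight.pt>
```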

Citation

@ARTICLE{10113233,
  author={Liu, Jinfu and Wang, Xinshun and Wang, Can and Gao, Yuan and Liu, Mengyuan},
  journal={IEEE Transactions on Multimedia}, 
  title={Temporal Decoupling Graph Convolutional Network for Skeleton-based Gesture Recognition}, 
  year={2024}
}

Our project is based on DSTA-Net and CTR-GCN.

Contact

For any questions, feel free to contact: liujf69@mail2.sysu.edu.cn