This is the official implementation of our TNNLS paper *Self-Supervised Time Series Representation Learning via Cross Reconstruction Transformer*.
Clone our repository and install the required packages with the following commands:

```bash
git clone https://github.com/BobZwr/Cross-Reconstruction-Transformer.git
cd Cross-Reconstruction-Transformer
pip install -r requirements.txt
```
We use `torch==1.13.0`.
We provide `data_processing.py` to generate phase and magnitude information from the time-domain data. You can modify this file to adapt it to your own datasets.
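For intuition, here is a minimal sketch of how magnitude and phase can be derived from a time-domain signal with an FFT. It is only an illustration of the idea, not the exact code in `data_processing.py`, and the `(n_samples, n_channels, seq_len)` layout is an assumption based on the sample commands below.

```python
# Hypothetical sketch of FFT-based magnitude/phase extraction; the actual
# logic lives in data_processing.py and may differ in detail.
import numpy as np

def to_mag_phase(x: np.ndarray):
    """Split time-domain signals into magnitude and phase spectra.

    x is assumed to have shape (n_samples, n_channels, seq_len),
    e.g. (N, 9, 256) to match the flags used below.
    """
    spec = np.fft.fft(x, axis=-1)   # complex spectrum along the time axis
    magnitude = np.abs(spec)        # per-frequency magnitude
    phase = np.angle(spec)          # per-frequency phase in radians
    return magnitude, phase
```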
We provide sample commands for training and evaluating our CRT:
```bash
# For Training:
python main.py --ssl True --sl True --load True --seq_len 256 --patch_len 8 --in_dim 9 --n_classes 6

# For Testing:
python main.py --ssl False --sl False --load False --seq_len 256 --patch_len 8 --in_dim 9 --n_classes 6
```
We also provide a subset of the HAR dataset for training and testing.
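If you plug in your own data, the flags above imply 9-channel series of length 256 with 6 classes. The following sanity check is only a sketch: the file names and the (channels, time) axis ordering are assumptions, not guarantees about this repository's loaders.

```python
# Hypothetical shape check for a custom dataset; file names are placeholders.
import numpy as np

X = np.load("train_x.npy")   # e.g. (n_samples, 9, 256): in_dim channels, seq_len steps
y = np.load("train_y.npy")   # e.g. (n_samples,): integer labels in [0, 6)

print(X.shape, y.shape, np.unique(y))
```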
If you find the code and datasets useful, please cite our paper:
```bibtex
@article{zhang2023self,
  title={Self-Supervised Time Series Representation Learning via Cross Reconstruction Transformer},
  author={Zhang, Wenrui and Yang, Ling and Geng, Shijia and Hong, Shenda},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  year={2023},
  publisher={IEEE}
}
```