
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0).

Self-Supervised Pre-Training of Networks with CLOCS

CLOCS is a patient-specific contrastive learning method for pre-training neural networks on medical time-series data. Pre-training with CLOCS improves the generalization of such networks on downstream supervised tasks where labelled data are limited.

This repository contains a PyTorch implementation of CLOCS. For details, see CLOCS: Contrastive Learning of Cardiac Signals Across Space, Time, and Patients. [ICML paper] [blogpost] [video]
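
As a rough sketch of the idea (not the repository's exact implementation), the patient-level objective can be written as an NT-Xent-style loss in which two segments from the same patient form a positive pair and segments from all other patients in the batch serve as negatives; the function name and temperature below are our own placeholders:

import torch
import torch.nn.functional as F

def patient_contrastive_loss(z_a, z_b, temperature=0.1):
    # z_a[i] and z_b[i] embed two segments (e.g. adjacent in time or
    # from different leads) of the same patient i; both have shape (B, D).
    z_a = F.normalize(z_a, dim=1)
    z_b = F.normalize(z_b, dim=1)
    sim = z_a @ z_b.t() / temperature  # (B, B) scaled cosine similarities
    targets = torch.arange(z_a.size(0), device=z_a.device)
    # Positive pairs sit on the diagonal of the similarity matrix.
    return 0.5 * (F.cross_entropy(sim, targets)
                  + F.cross_entropy(sim.t(), targets))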

Requirements

The CLOCS code requires the following:

  • Python 3.6 or higher
  • PyTorch 1.0 or higher

Datasets

Download

The datasets can be downloaded from the following links:

  1. PhysioNet 2020
  2. Chapman
  3. Cardiology
  4. PhysioNet 2017

Pre-processing

To pre-process the datasets for CLOCS and the downstream supervised tasks, please refer to the following repository.
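
As an illustration of the segment-level inputs CLOCS operates on, the sketch below splits a single-lead recording into fixed-length, non-overlapping windows; the function name and window length are placeholders, and the linked repository remains the authoritative pipeline:

import numpy as np

def segment_record(signal, segment_len=2500):
    # Split a 1-D recording into non-overlapping fixed-length segments;
    # trailing samples that do not fill a full window are dropped.
    n_full = (len(signal) // segment_len) * segment_len
    return signal[:n_full].reshape(-1, segment_len)

# Example: a 10,000-sample recording yields four 2500-sample segments.
segments = segment_record(np.random.randn(10_000))
print(segments.shape)  # (4, 2500)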

Training

To train the model(s) in the paper, run this command:

python run_experiments.py
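
Schematically, pre-training draws two same-patient views per example and minimises a contrastive objective such as the patient_contrastive_loss sketched earlier; the toy encoder, optimiser settings, and random batch below are placeholders, not the repository's configuration:

import torch
import torch.nn as nn

encoder = nn.Sequential(  # placeholder 1-D CNN encoder
    nn.Conv1d(1, 32, kernel_size=7, stride=3),
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),
    nn.Flatten(),
)
opt = torch.optim.Adam(encoder.parameters(), lr=1e-4)

# Toy batch: two views (e.g. adjacent segments) for each of 8 patients.
seg_a, seg_b = torch.randn(8, 1, 2500), torch.randn(8, 1, 2500)
loss = patient_contrastive_loss(encoder(seg_a), encoder(seg_b))
opt.zero_grad()
loss.backward()
opt.step()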

Evaluation

To evaluate the model(s) in the paper, run this command:

python run_experiments.py
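
For the downstream task, the pre-trained encoder is restored and a task-specific linear head is trained on the limited labelled data. A minimal sketch, in which the encoder architecture, checkpoint path, and class count are hypothetical:

import torch
import torch.nn as nn

encoder = nn.Sequential(  # must match the architecture used for pre-training
    nn.Conv1d(1, 32, kernel_size=7, stride=3),
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),
    nn.Flatten(),
)
# encoder.load_state_dict(torch.load('pretrained_encoder.pt'))  # hypothetical checkpoint

model = nn.Sequential(encoder, nn.Linear(32, 4))  # linear head, e.g. 4 rhythm classes

x = torch.randn(8, 1, 2500)  # batch of 8 single-lead ECG segments
logits = model(x)            # (8, 4) logits for supervised fine-tuning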

Citing

If you use our code in your research, please consider citing it using the following BibTeX entry.

@inproceedings{kiyasseh2021clocs,
  title={{CLOCS}: Contrastive Learning of Cardiac Signals Across Space, Time, and Patients},
  author={Kiyasseh, Dani and Zhu, Tingting and Clifton, David A},
  booktitle={International Conference on Machine Learning},
  pages={5606--5615},
  year={2021},
  organization={PMLR}
}