LLIE-CLIP

CLIP: Cascaded Learning with Inception Pattern on Low-Light Image Enhancement

A group project for CS7303 (2022-2023 autumn)

The original model is based on SCI.

requirements

python 3.10
pytorch==1.11.0
wandb
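
Assuming a pip-based environment, the dependencies can be installed with something like the line below; the exact PyTorch command depends on your CUDA setup, so check pytorch.org for the right variant.

pip install torch==1.11.0 wandb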

how to run

  1. Download our combined datasets from Baiduyun (code: 0il6), unzip them, and place them under the Datasets directory. You can also use your own datasets and modify the training script accordingly. The datasets we use are selected from DarkFace, GLADNet, and the LOL dataset.

  2. To run the traditional algorithms (an example invocation is shown after this list):

python test_classic.py --data DATA --mode MODE

  3. To run the CLIP model with Slurm and DDP, simply use

sbatch run_ddp_new.sh
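
For example, a run of one of the classic methods might look like the line below; the argument values are hypothetical, so check test_classic.py for the accepted --data paths and --mode names.

python test_classic.py --data Datasets --mode he  # hypothetical: "he" for histogram equalization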

You can also run this script locally in a shell by setting SLURM_ARRAY_TASK_ID to any preset index:

sh run_ddp_new.sh 
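
For instance, to pick the first preset configuration (the index 0 here is illustrative):

SLURM_ARRAY_TASK_ID=0 sh run_ddp_new.sh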

You can also change one or more arguments in the script; their descriptions can be found in train_ddp.py.
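
For orientation, the sketch below shows the standard PyTorch DistributedDataParallel pattern that a script like train_ddp.py typically follows. It is illustrative only: it assumes torchrun or the Slurm wrapper sets the usual RANK/WORLD_SIZE/LOCAL_RANK environment variables, and the model and arguments are placeholders rather than the repo's actual code.

import argparse
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--lr", type=float, default=1e-4)   # hypothetical argument
    parser.add_argument("--epochs", type=int, default=100)  # hypothetical argument
    args = parser.parse_args()

    # torchrun / the Slurm wrapper is assumed to set LOCAL_RANK for each process
    local_rank = int(os.environ["LOCAL_RANK"])
    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(local_rank)

    # stand-in network: a single conv layer in place of the real enhancement model
    model = torch.nn.Conv2d(3, 3, kernel_size=3, padding=1).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.Adam(model.parameters(), lr=args.lr)

    for _ in range(args.epochs):
        x = torch.rand(4, 3, 64, 64, device=local_rank)  # dummy low-light batch
        optimizer.zero_grad()
        loss = model(x).mean()  # placeholder loss
        loss.backward()
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()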