This is the official implementation of the arXiv paper.
- Create a virtual environment using `conda` or `virtualenv`.
- Install the required packages (the package versions may vary depending on your device):

      pip install -r requirements.txt

- Download the dataset and organize it yourself into the structure below:

      .
      ├── dataset
      │   ├── Rain100L
      │   │   ├── test
      │   │   │   ├── input
      │   │   │   ├── gt

- Generate txt files listing the paths of the images you want to process. See `dataset/Rain100L/testing.txt` for an example.
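For example, a minimal sketch of how such a list could be generated (the exact line format is an assumption; check `dataset/Rain100L/testing.txt` for the format the code actually expects):

```python
import os

# Assumed layout, matching the structure above; adjust to your own paths.
input_dir = "dataset/Rain100L/test/input"
list_file = "dataset/Rain100L/testing.txt"

with open(list_file, "w") as f:
    for name in sorted(os.listdir(input_dir)):
        if name.lower().endswith((".png", ".jpg", ".jpeg")):
            # One image path per line, relative to the dataset root (assumption).
            f.write(os.path.join("Rain100L/test/input", name) + "\n")
```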
This part is implemented in Matlab and is modified from the source code of TIP 2012.
- Make sure the required packages (such as SPAMS) are installed.
- Modify `file_path` and `rain_component_path` in `rain_mask/extract_mask.m` and run it.
- Modify `src_dir` and `binary_mask_dir` in `binarization.py` and run the commands:

      cd rain_mask
      python binarization.py

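Purely as a rough illustration (the repository's `binarization.py` is the authoritative implementation), a simple threshold-based binarization of the extracted rain component could look like this; the directory names and threshold value are assumptions:

```python
import os
import cv2

# Hypothetical directories; set these to the same values as src_dir / binary_mask_dir.
src_dir = "rain_component"
binary_mask_dir = "binary_mask"
os.makedirs(binary_mask_dir, exist_ok=True)

for name in sorted(os.listdir(src_dir)):
    component = cv2.imread(os.path.join(src_dir, name), cv2.IMREAD_GRAYSCALE)
    if component is None:
        continue
    # The threshold of 10 is an illustrative choice, not the repository's value.
    _, mask = cv2.threshold(component, 10, 255, cv2.THRESH_BINARY)
    cv2.imwrite(os.path.join(binary_mask_dir, name), mask)
```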
Modify `dataset_path`, `save_path`, and `target_path` in `stochastic_filling.py` and run the command:

    python stochastic_filling.py
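Purely as an illustration of one common way to fill masked pixels (not necessarily what `stochastic_filling.py` does; consult that script for the actual procedure), masked rain pixels can be replaced with values sampled at random from nearby unmasked pixels:

```python
import numpy as np

def fill_masked_pixels(image, mask, window=5, seed=0):
    """Replace pixels where mask > 0 with values sampled at random from
    unmasked pixels inside a local window (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    filled = image.copy()
    h, w = mask.shape[:2]
    half = window // 2
    for y, x in zip(*np.nonzero(mask)):
        y0, y1 = max(0, y - half), min(h, y + half + 1)
        x0, x1 = max(0, x - half), min(w, x + half + 1)
        candidates = image[y0:y1, x0:x1][mask[y0:y1, x0:x1] == 0]
        if len(candidates) > 0:
            filled[y, x] = candidates[rng.integers(len(candidates))]
    return filled
```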
Modify the default values of `image_dir_path`, `data_path`, and `save_dir_path` in `main.py` and run the command:

    cd ../
    python main.py

or just pass them as command-line arguments:

    cd ../
    python main.py --image_dir_path './dataset/' --data_path './dataset/Rain100L/testing.txt' --save_dir_path './Results/Rain100L/test/SRL-Derain/'
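For reference, a minimal sketch of how such an argparse interface is typically defined (the actual defaults and option handling in `main.py` may differ):

```python
import argparse

parser = argparse.ArgumentParser(description="SRL-Derain")
# Default values here are assumptions; edit them to match your own setup.
parser.add_argument("--image_dir_path", type=str, default="./dataset/")
parser.add_argument("--data_path", type=str, default="./dataset/Rain100L/testing.txt")
parser.add_argument("--save_dir_path", type=str, default="./Results/Rain100L/test/SRL-Derain/")
args = parser.parse_args()
print(args.image_dir_path, args.data_path, args.save_dir_path)
```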
PS: For rainy images that are too large to be derained at once, use `main_overlapped.py` instead of `main.py`. The image is randomly cropped during training, and inference is performed on overlapping patches.
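As a rough sketch of the general idea behind overlapped patch inference (not the exact logic in `main_overlapped.py`), an image can be processed in overlapping tiles whose predictions are averaged where they overlap:

```python
import numpy as np

def infer_by_overlapping_patches(image, model_fn, patch=64, stride=48):
    """Run model_fn on overlapping patches of an HxWxC image and average
    the predictions where patches overlap (illustrative sketch only)."""
    h, w = image.shape[:2]
    output = np.zeros_like(image, dtype=np.float64)
    weight = np.zeros((h, w, 1), dtype=np.float64)
    ys = sorted(set(list(range(0, max(h - patch, 1), stride)) + [max(h - patch, 0)]))
    xs = sorted(set(list(range(0, max(w - patch, 1), stride)) + [max(w - patch, 0)]))
    for y in ys:
        for x in xs:
            out = model_fn(image[y:y + patch, x:x + patch])
            output[y:y + patch, x:x + patch] += out
            weight[y:y + patch, x:x + patch] += 1.0
    return output / weight
```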
In the ablation study, we also provide derained results obtained with a multiple-training strategy, where the agents are trained on the training set and inference is performed on the testing set:

    python main_multiple.py --mode 'train' --data_path './dataset/Rain100L/training.txt' --save_dir_path './Results/Rain100L/test/SRL-Derain_multiple/'
    python main_multiple.py --mode 'test' --data_path './dataset/Rain100L/testing.txt' --save_dir_path './Results/Rain100L/test/SRL-Derain_multiple/derained_result/' --model_weight_path './Results/Rain100L/test/SRL-Derain_multiple/model_weight/last/model.npz'
The demo video is available on Google Drive.