Mahmoud Afifi1, Brian Price2, Scott Cohen2, and Michael S. Brown1
1York University 2Adobe Research
Reference code for the paper *When Color Constancy Goes Wrong: Correcting Improperly White-Balanced Images*, Mahmoud Afifi, Brian Price, Scott Cohen, and Michael S. Brown, CVPR 2019. If you use this code or our dataset, please cite our paper:
```
@inproceedings{afifi2019color,
  title={When Color Constancy Goes Wrong: Correcting Improperly White-Balanced Images},
  author={Afifi, Mahmoud and Price, Brian and Cohen, Scott and Brown, Michael S},
  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
  pages={1535--1544},
  year={2019}
}
```
The original source code of our paper was written in Matlab; we also provide a Python version. We tried to make both versions identical, but there is no guarantee that the Python version will give exactly the same results. Any differences should be due to rounding errors introduced when converting our model to Python, or to differences between Matlab and OpenCV in reading compressed images.
**Matlab**
- Run `install_.m`.
- Run `demo.m` to process a single image, or `demo_images.m` to process all images in a directory.
- Check `evaluation_examples.m` for examples of reporting errors using different evaluation metrics. This code also includes an example of how to hide the color chart for Set1 images.
**Python**
- Requirements: `numpy`, `opencv-python`, and `skimage` (`skimage` is required for the evaluation code only).
- Run `demo.py` to process a single image, or `demo_images.py` to process all images in a directory.
- Check `evaluation_examples.py` for examples of reporting errors using different evaluation metrics. This code also includes an example of how to hide the color chart for Set1 images.
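As a self-contained illustration of two metrics commonly reported when evaluating white-balance correction, mean squared error and mean per-pixel angular error, here is a small sketch (our own example, not the repository's evaluation code; function names are hypothetical):

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two images of the same shape."""
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def mean_angular_error(a, b, eps=1e-9):
    """Mean angle (degrees) between corresponding RGB vectors of two images."""
    a = a.reshape(-1, 3).astype(float)
    b = b.reshape(-1, 3).astype(float)
    cos = np.sum(a * b, axis=1) / (
        np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1) + eps)
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))).mean())

gt = np.ones((4, 4, 3))          # ground-truth image (all white)
est = np.ones((4, 4, 3)) * 0.5   # estimate: same chromaticity, half intensity
print(mse(gt, est))              # 0.25
print(mean_angular_error(gt, est))  # ~0: identical color direction
```

The angular error is insensitive to intensity scaling (it compares color directions only), which is why the half-brightness estimate above scores near zero despite a nonzero MSE.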
We provide a Matlab GUI to help tune our parameters interactively. Please check `demo_GUI.m`.
- `K`: number of nearest neighbors in the KNN search (Sec. 3.4 in the paper) -- change its value to enhance the results.
- `sigma`: the fall-off factor for KNN blending (Eq. 8 in the paper) -- change its value to enhance the results.
- `device`: GPU or CPU (provided for the Matlab version only).
- `gamut_mapping`: mapping pixels in-gamut either by scaling (`gamut_mapping = 1`) or clipping (`gamut_mapping = 2`). In the paper, we used the clipping option to report our results, but the scaling option gives compelling results in some cases (especially with highly saturated/vivid images).
- `upgraded_model` and `upgraded`: to load our upgraded model, use `upgraded_model = 1` in Matlab or `upgraded = 1` in Python. The upgraded model includes new training examples. We did not use this model for the results reported in the paper; however, our online demo uses it.
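The effect of `sigma` and `gamut_mapping` can be sketched with a small self-contained example. This is not the repository's implementation (the exact blending formula is Eq. 8 in the paper), and the function names here are hypothetical:

```python
import numpy as np

def knn_blending_weights(distances, sigma=0.25):
    """Distance-based fall-off weights for blending the K nearest
    training examples: closer examples get larger, normalized weights."""
    w = np.exp(-(np.asarray(distances, dtype=float) ** 2) / (2 * sigma ** 2))
    return w / w.sum()

def map_into_gamut(img, gamut_mapping=2):
    """Bring out-of-gamut values of an RGB image (floats in [0, 1]) back
    in-gamut: option 1 scales each pixel by its max channel, option 2 clips."""
    if gamut_mapping == 1:  # scaling
        peak = np.maximum(img.max(axis=-1, keepdims=True), 1.0)
        return img / peak
    return np.clip(img, 0.0, 1.0)  # clipping

weights = knn_blending_weights([0.1, 0.2, 0.4], sigma=0.25)
pixels = np.array([[1.2, 0.5, 0.3], [0.4, 0.9, 0.2]])  # first pixel out of gamut
scaled = map_into_gamut(pixels, gamut_mapping=1)
clipped = map_into_gamut(pixels, gamut_mapping=2)
```

A smaller `sigma` concentrates almost all weight on the single nearest example, while a larger `sigma` blends the neighbors more evenly; scaling preserves the ratio between channels of out-of-gamut pixels, whereas clipping changes their hue but leaves in-gamut channels untouched.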
In the paper, we mentioned that our dataset contains over 65,000 images. We have since added two additional sets of rendered images, for a total of 105,638 rendered images. You can download our dataset from here.
Try the interactive demo by uploading your photo or pasting a URL of a photo from the web.
For more information, please visit our project page.