# COOL-FST TensorFlow

Add styles from any pattern to any photo in a fraction of a second!
This project is inspired by lengstrom's work in fast-style-transfer. Our implementation is based on a combination of Gatys' *A Neural Algorithm of Artistic Style*, Johnson's *Perceptual Losses for Real-Time Style Transfer and Super-Resolution*, and Ulyanov's *Instance Normalization*.
Our implementation uses TensorFlow and Keras to train a fast style transfer network. We use roughly the same transformation network as described in Johnson, except that batch normalization is replaced with Ulyanov's instance normalization, and the scaling/offset of the output `tanh` layer is slightly different. We use a loss function close to the one described in Gatys, using VGG19 instead of VGG16 and typically using "shallower" layers than in Johnson's implementation (e.g. we use `relu1_1` rather than `relu1_2`). Empirically, this results in larger-scale style features in transformations.
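To illustrate the instance-normalization swap mentioned above, here is a minimal NumPy sketch of the idea (an illustration only, not the project's actual layer code; the NHWC shape convention is an assumption):

```python
import numpy as np

def instance_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Instance normalization (Ulyanov et al.): normalize each feature map
    of each image independently. Batch normalization, by contrast, pools
    statistics across the whole batch.

    x: activations with shape (batch, height, width, channels).
    """
    # Per-image, per-channel statistics over the spatial axes only.
    mean = x.mean(axis=(1, 2), keepdims=True)
    var = x.var(axis=(1, 2), keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

# Each (image, channel) slice is normalized on its own, regardless of
# what the rest of the batch looks like.
x = np.random.default_rng(0).normal(5.0, 3.0, size=(2, 8, 8, 3))
y = instance_norm(x)
```

Because the statistics are per-image, the stylization of one photo does not depend on the other photos in the batch, which is the property that makes instance normalization a good fit for style transfer.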
Use `style.py` to train a new style transfer network. Run `python style.py` to view all the possible parameters. Training takes 4-6 hours on a Maxwell Titan X. More detailed documentation here. Before you run this, you should run `setup.sh`. Example usage:
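A representative training invocation might look like the following. The paths are placeholders, and the flag names here follow the upstream fast-style-transfer conventions as an assumption; confirm them against the output of `python style.py`:

```shell
python style.py --style path/to/style/img.jpg \
  --checkpoint-dir checkpoint/path \
  --test path/to/test/img.jpg \
  --test-dir path/to/test/dir \
  --content-weight 1.5e1 \
  --checkpoint-iterations 1000 \
  --batch-size 20
```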
Use `evaluate.py` to evaluate a style transfer network. Run `python evaluate.py` to view all the possible parameters. Evaluation takes 100 ms per frame (when batch size is 1) on a Maxwell Titan X, and several seconds per frame on a CPU. More detailed documentation here. Models for evaluation are located here. Example usage:
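A representative evaluation invocation might look like the following. The paths are placeholders, and the flag names follow the upstream fast-style-transfer conventions as an assumption; confirm them against the output of `python evaluate.py`:

```shell
python evaluate.py --checkpoint path/to/style/model.ckpt \
  --in-path dir/of/test/imgs/ \
  --out-path dir/for/results/
```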
```
@misc{engstrom2016faststyletransfer,
  author = {Logan Engstrom},
  title = {Fast Style Transfer},
  year = {2016},
  howpublished = {\url{https://github.com/lengstrom/fast-style-transfer/}},
  note = {commit xxxxxxx}
}
```