Experiment that visualizes the patterns learned by a neural network.
python main.py --num_steps=900 --learning_rate=0.001 --scaling=False
python main.py --num_steps=300 --learning_rate=0.001 --scaling=True --num_octaves=3
python main.py --num_steps=300 \
--learning_rate=0.001 \
--scaling=True \
--URL=1024px-Van_Gogh_-_Starry_Night_-_Google_Art_Project.jpg
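Below is a hypothetical sketch of the flag parsing that these commands assume; the actual argument names and defaults in `main.py` may differ.

```python
import argparse

def str_to_bool(value):
    # Lets flags be written as --scaling=True / --scaling=False.
    return str(value).lower() in ("true", "1", "yes")

def parse_args():
    parser = argparse.ArgumentParser(
        description="Visualize the patterns learned by a neural network.")
    parser.add_argument("--num_steps", type=int, default=300,
                        help="Number of gradient ascent steps.")
    parser.add_argument("--learning_rate", type=float, default=0.001,
                        help="Step size for gradient ascent.")
    parser.add_argument("--scaling", type=str_to_bool, default=False,
                        help="Apply gradient ascent at multiple scales (octaves).")
    parser.add_argument("--num_octaves", type=int, default=3,
                        help="Number of scales used when --scaling=True.")
    parser.add_argument("--URL", type=str, default=None,
                        help="Image file to use as the starting point.")
    return parser.parse_args()
```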
| Training without scaling | Same number of steps, with scaling added |
| --- | --- |
You can see that the output without scaling is:

- Noisy.
- Low resolution.
- Composed of patterns that all appear at the same granularity.
Scaling addresses these problems by applying gradient ascent at several successive scales (octaves). Patterns generated at the smaller scales are carried over into the larger scales and filled in with additional detail, as sketched below.
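The sketch below shows one way this multi-scale (octave) gradient ascent loop can be written with TensorFlow/Keras and an InceptionV3 feature extractor. The chosen layers (`mixed3`, `mixed5`), the loss, and the function names are assumptions for illustration, not the actual `main.py` implementation.

```python
import tensorflow as tf
from tensorflow import keras

# Feature extractor over intermediate InceptionV3 layers; their mean
# activation is the signal that gradient ascent maximizes.
base = keras.applications.InceptionV3(weights="imagenet", include_top=False)
feature_extractor = keras.Model(
    inputs=base.input,
    outputs=[base.get_layer(name).output for name in ("mixed3", "mixed5")],
)

def compute_loss(image):
    activations = feature_extractor(image)
    return tf.add_n([tf.reduce_mean(act) for act in activations])

def gradient_ascent_step(image, learning_rate):
    with tf.GradientTape() as tape:
        tape.watch(image)
        loss = compute_loss(image)
    grads = tape.gradient(loss, image)
    # Normalize so the step size is roughly independent of gradient magnitude.
    grads /= tf.math.reduce_std(grads) + 1e-8
    return image + learning_rate * grads

def run_octaves(image, num_octaves=3, octave_scale=1.4,
                num_steps=300, learning_rate=0.001):
    # Work from coarse to fine: each octave upscales the image, so patterns
    # found at the smaller scales get refined with additional detail.
    base_shape = tf.cast(tf.shape(image)[1:3], tf.float32)
    for octave in range(num_octaves):
        exponent = octave - (num_octaves - 1)
        new_shape = tf.cast(base_shape * octave_scale ** exponent, tf.int32)
        image = tf.image.resize(image, new_shape)
        for _ in range(num_steps):
            image = gradient_ascent_step(image, learning_rate)
    return image
```

The input is assumed to be a batched float tensor preprocessed with `keras.applications.inception_v3.preprocess_input` (values in [-1, 1]); after the final octave the image is back at its original resolution.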
Code: Inception V3 model for Keras
Paper: Rethinking the Inception Architecture for Computer Vision (CVPR 2016)