Keras examples
Vision models examples
mnist_mlp Trains a simple deep multi-layer perceptron on the MNIST dataset.
mnist_cnn Trains a simple convnet on the MNIST dataset.
cifar10_cnn Trains a simple deep CNN on the CIFAR10 small images dataset.
cifar10_resnet Trains a ResNet on the CIFAR10 small images dataset.
conv_lstm Demonstrates the use of a convolutional LSTM network.
image_ocr Trains a convolutional stack followed by a recurrent stack and a CTC logloss function to perform optical character recognition (OCR).
mnist_acgan Implementation of AC-GAN (Auxiliary Classifier GAN) on the MNIST dataset.
mnist_hierarchical_rnn Trains a Hierarchical RNN (HRNN) to classify MNIST digits.
mnist_siamese Trains a Siamese multi-layer perceptron on pairs of digits from the MNIST dataset.
mnist_swwae Trains a Stacked What-Where AutoEncoder built on residual blocks on the MNIST dataset.
mnist_transfer_cnn Transfer learning toy example.
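The mnist_siamese example above hinges on a contrastive loss that pulls matching pairs together and pushes non-matching pairs apart up to a margin. A minimal pure-Python sketch of that loss for a single pair (the function name and margin default are illustrative, not the script's exact code, which operates on batched tensors):

```python
def contrastive_loss(y_true, distance, margin=1.0):
    """Contrastive loss (Hadsell et al., 2006) for one pair of embeddings.

    y_true   -- 1 for a matching pair, 0 for a non-matching pair
    distance -- Euclidean distance between the two embeddings

    Matching pairs are penalized by their squared distance; non-matching
    pairs are penalized only when they fall inside the margin.
    """
    return (y_true * distance ** 2
            + (1 - y_true) * max(margin - distance, 0.0) ** 2)
```

For example, a non-matching pair at distance 0.5 inside the default margin of 1.0 incurs a loss of 0.25, while one at distance 2.0 incurs none.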
Text & sequences examples
addition_rnn Implementation of sequence to sequence learning for performing addition of two numbers (as strings).
babi_rnn Trains a two-branch recurrent network on the bAbI dataset for reading comprehension.
babi_memnn Trains a memory network on the bAbI dataset for reading comprehension.
imdb_bidirectional_lstm Trains a Bidirectional LSTM on the IMDB sentiment classification task.
imdb_cnn Demonstrates the use of Convolution1D for text classification.
imdb_cnn_lstm Trains a convolutional stack followed by a recurrent stack network on the IMDB sentiment classification task.
imdb_fasttext Trains a FastText model on the IMDB sentiment classification task.
imdb_lstm Trains an LSTM model on the IMDB sentiment classification task.
lstm_stateful Demonstrates how to use stateful RNNs to model long sequences efficiently.
pretrained_word_embeddings Loads pre-trained word embeddings (GloVe embeddings) into a frozen Keras Embedding layer, and uses it to train a text classification model on the 20 Newsgroup dataset.
reuters_mlp Trains and evaluates a simple MLP on the Reuters newswire topic classification task.
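The imdb_fasttext script augments each word-id sequence with n-gram features (the fastText trick) before averaging embeddings, so word order is partially captured without recurrence. A stdlib-only sketch of the n-gram extraction step (the helper name is illustrative):

```python
def ngram_set(seq, n=2):
    """Return the set of contiguous n-grams in a token-id sequence.

    For n=2 on [1, 2, 3] this yields {(1, 2), (2, 3)}; each n-gram can
    then be assigned its own id and appended to the input sequence.
    """
    return set(zip(*[seq[i:] for i in range(n)]))
```

In the full pipeline, every distinct n-gram found in the training set gets a fresh vocabulary index, and those indices are appended to each sequence before the embedding layer.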
Generative models examples
lstm_text_generation Generates text from Nietzsche's writings.
conv_filter_visualization Visualization of the filters of VGG16, via gradient ascent in input space.
deep_dream Deep Dreams in Keras.
neural_doodle Neural doodle.
neural_style_transfer Neural style transfer.
variational_autoencoder Demonstrates how to build a variational autoencoder.
variational_autoencoder_deconv Demonstrates how to build a variational autoencoder with Keras using deconvolution layers.
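The lstm_text_generation script samples each next character from the network's softmax output after rescaling by a temperature: low temperatures make the choice nearly greedy, high ones make it more diverse. A pure-Python sketch of that sampling step (assumes strictly positive probabilities; the helper mirrors, but is not copied from, the script's code):

```python
import math
import random

def sample(preds, temperature=1.0):
    """Sample an index from a probability distribution rescaled by
    temperature: logits are divided by temperature, then re-softmaxed."""
    logits = [math.log(p) / temperature for p in preds]
    exps = [math.exp(l - max(logits)) for l in logits]  # stable softmax
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw from the cumulative distribution.
    r = random.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1
```

At temperature 1.0 this reduces to sampling from the original distribution; as temperature approaches 0 it approaches argmax.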
Examples demonstrating specific Keras functionality
antirectifier Demonstrates how to write custom layers for Keras.
mnist_sklearn_wrapper Demonstrates how to use the sklearn wrapper.
mnist_irnn Reproduction of the IRNN experiment with pixel-by-pixel sequential MNIST in "A Simple Way to Initialize Recurrent Networks of Rectified Linear Units" by Le et al.
mnist_net2net Reproduction of the Net2Net experiment with MNIST in "Net2Net: Accelerating Learning via Knowledge Transfer".
reuters_mlp_relu_vs_selu Compares self-normalizing MLPs with regular MLPs.
mnist_tfrecord MNIST dataset with TFRecords, the standard TensorFlow data format.
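The antirectifier example replaces ReLU with a layer that keeps both the positive and the negative part of a centered, L2-normalized input, doubling the feature dimension instead of discarding sign information. A plain-Python sketch of the activation applied to a single vector (the actual Keras layer applies the same idea batch-wise with tensor ops):

```python
import math

def antirectifier(x):
    """Antirectifier activation on one vector: center, L2-normalize,
    then concatenate the positive and negative parts."""
    mean = sum(x) / len(x)
    centered = [v - mean for v in x]
    norm = math.sqrt(sum(v * v for v in centered)) or 1.0
    normed = [v / norm for v in centered]
    pos = [max(v, 0.0) for v in normed]   # ReLU(x)
    neg = [max(-v, 0.0) for v in normed]  # ReLU(-x)
    return pos + neg
```

Because both halves are kept, an input of dimension d produces an output of dimension 2d, which is why the example's Dense layers are sized accordingly.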