TensorFlow Examples
Code examples for some popular machine learning algorithms, using the TensorFlow library. This tutorial is designed to make it easy to dive into TensorFlow through examples. Each example includes both a notebook and plain code, with explanations.
Tutorial index
1 - Introduction
2 - Basic Classifiers
- Nearest Neighbor (notebook) (code)
- Linear Regression (notebook) (code)
- Logistic Regression (notebook) (code)
3 - Neural Networks
- Multilayer Perceptron (notebook) (code)
- Convolutional Neural Network (notebook) (code)
- AlexNet (notebook) (code)
- Recurrent Neural Network (LSTM) (notebook) (code)
- Bidirectional Recurrent Neural Network (LSTM) (notebook) (code)
- AutoEncoder (code)
4 - Multi GPU
5 - User Interface (Tensorboard)
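To give a feel for what the first classifier above does, here is a framework-free sketch of the nearest-neighbor idea (the TensorFlow example uses an L1 distance over MNIST pixels; the toy 2-D data here is purely illustrative):

```python
import numpy as np

def nearest_neighbor_predict(train_x, train_y, test_x):
    """For each test point, return the label of the closest training
    point under the L1 (Manhattan) distance."""
    preds = []
    for x in test_x:
        # L1 distance from this test point to every training point.
        dists = np.sum(np.abs(train_x - x), axis=1)
        preds.append(train_y[np.argmin(dists)])
    return np.array(preds)

# Toy data: two well-separated clusters with labels 0 and 1.
train_x = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])
train_y = np.array([0, 0, 1, 1])
test_x = np.array([[0.2, 0.1], [4.8, 5.1]])

print(nearest_neighbor_predict(train_x, train_y, test_x))  # -> [0 1]
```

The actual notebook expresses the same distance-and-argmin computation as TensorFlow graph operations over the MNIST dataset.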
More Examples
The following examples come from TFLearn, a library that provides a simplified interface for TensorFlow. It is worth a look: it offers many examples as well as pre-built operations and layers.
Basics
- Linear Regression. Implement a linear regression using TFLearn.
- Logical Operators. Implement logical operators with TFLearn (also includes an example of using 'merge').
- Weights Persistence. Save and restore a model.
- Fine-Tuning. Fine-Tune a pre-trained model on a new task.
- Using HDF5. Use HDF5 to handle large datasets.
- Using DASK. Use DASK to handle large datasets.
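The linear-regression example above boils down to a short fitting loop. A framework-free NumPy sketch of gradient descent on the mean squared error (toy noise-free data and illustrative hyperparameters, not the library's own code):

```python
import numpy as np

# Toy data drawn from y = 2x + 1 (no noise, so the fit should be near-exact).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0

w, b = 0.0, 0.0  # model: y_hat = w * x + b
lr = 0.05        # learning rate
for _ in range(2000):
    err = (w * x + b) - y
    # Gradients of the mean squared error with respect to w and b.
    w -= lr * 2.0 * np.mean(err * x)
    b -= lr * 2.0 * np.mean(err)

print(round(w, 2), round(b, 2))  # -> 2.0 1.0
```

TFLearn (and TensorFlow) automate exactly this: they compute the gradients for you and run the update loop with a chosen optimizer.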
Computer Vision
- Multi-Layer Perceptron. A multi-layer perceptron implementation for the MNIST classification task.
- Convolutional Network (MNIST). A convolutional neural network implementation for classifying the MNIST dataset.
- Convolutional Network (CIFAR-10). A convolutional neural network implementation for classifying the CIFAR-10 dataset.
- Network in Network. A 'Network in Network' implementation for classifying the CIFAR-10 dataset.
- AlexNet. Apply AlexNet to the Oxford Flowers 17 classification task.
- VGGNet. Apply the VGG network to the Oxford Flowers 17 classification task.
- RNN Pixels. Use an RNN (over a sequence of pixels) to classify images.
- Residual Network (MNIST). A residual network with shallow bottlenecks applied to the MNIST classification task.
- Residual Network (CIFAR-10). A residual network with deep bottlenecks applied to the CIFAR-10 classification task.
- Auto-Encoder. An auto-encoder applied to MNIST handwritten digits.
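The convolutional examples above are all built on one core operation. A minimal NumPy sketch of a single-channel 'valid' convolution (strictly speaking cross-correlation, which is what deep-learning libraries compute), with an illustrative averaging kernel:

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D convolution: slide the kernel over the image and take
    the elementwise product-sum at each position."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.array([[1., 2., 3.],
                  [4., 5., 6.],
                  [7., 8., 9.]])
kernel = np.ones((2, 2)) / 4.0  # 2x2 averaging filter

print(conv2d(image, kernel))  # each output cell averages one 2x2 patch
```

In a real convolutional network the kernel values are learned rather than fixed, there are many kernels per layer, and the operation runs over multiple channels, but the sliding product-sum is the same.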
Natural Language Processing
- Recurrent Network (LSTM). Apply an LSTM to the IMDB sentiment dataset classification task.
- Bi-Directional LSTM. Apply a bi-directional LSTM to the IMDB sentiment dataset classification task.
- City Name Generation. Generate new US city names using an LSTM network.
- Shakespeare Scripts Generation. Generate new Shakespeare-style scripts using an LSTM network.
Dependencies
tensorflow
numpy
matplotlib
cuda (to run examples on GPU)
tflearn (if using tflearn examples)
For more details about installing TensorFlow, you can check Setup_TensorFlow.md.
Dataset
Some examples require the MNIST dataset for training and testing. Don't worry: the dataset is downloaded automatically when you run those examples (via input_data.py). MNIST is a database of handwritten digits, with 60,000 training examples and 10,000 test examples. (Website: http://yann.lecun.com/exdb/mnist/)
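For reference, the files that get downloaded use the simple big-endian IDX format described on the MNIST website (a 4-byte magic number, 4-byte counts, then raw bytes). A minimal parser for a label buffer, demonstrated on a tiny hand-built buffer rather than a real download:

```python
import struct

def parse_idx_labels(buf):
    """Parse an MNIST IDX1 label buffer: a 4-byte big-endian magic
    number (2049 for label files), a 4-byte big-endian item count,
    then one unsigned byte per label."""
    magic, count = struct.unpack(">II", buf[:8])
    assert magic == 2049, "not an IDX1 label file"
    return list(buf[8:8 + count])

# Tiny fake label file containing three labels: 5, 0, 4.
fake = struct.pack(">II", 2049, 3) + bytes([5, 0, 4])
print(parse_idx_labels(fake))  # -> [5, 0, 4]
```

The image files follow the same pattern with magic number 2051 and two extra 4-byte fields for the row and column counts; input_data.py handles all of this for you.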