A project template to simplify building and training deep learning models using Keras.
- Getting Started
- Running The Demo Project
- Comet.ml Integration
- Template Details
- Future Work
- Example Projects
- Contributing
- Acknowledgements
This template allows you to easily build and train deep learning models with checkpointing and TensorBoard visualization.
In order to use the template you have to:
- Define a data loader class.
- Define a model class that inherits from BaseModel.
- Define a trainer class that inherits from BaseTrainer.
- Define a configuration file with the parameters needed in an experiment.
- Run the model using:
python main.py -c [path to configuration file]
A simple model for the MNIST dataset is available for testing the template. To run the demo project:
- Start the training using:
python main.py -c configs/simple_mnist_config.json
- Start Tensorboard visualization using:
tensorboard --logdir=experiments/simple_mnist/logs
This template also supports reporting to Comet.ml, which allows you to see all your hyperparameters, metrics, graphs, dependencies and more, including real-time metrics.
Add your API key in the configuration file:
For example: "comet_api_key": "your key here"
Here's how it looks after you start training:
You can also link your GitHub repository to your Comet.ml project for full version control.
├── main.py - an example main script that is responsible for the whole pipeline.
│
│
├── base - this folder contains the abstract classes of the project components
│ ├── base_data_loader.py - this file contains the abstract class of the data loader.
│ ├── base_model.py - this file contains the abstract class of the model.
│ └── base_train.py - this file contains the abstract class of the trainer.
│
│
├── model - this folder contains the models of your project.
│ └── simple_mnist_model.py
│
│
├── trainer - this folder contains the trainers of your project.
│ └── simple_mnist_trainer.py
│
│
├── data_loader - this folder contains the data loaders of your project.
│ └── simple_mnist_data_loader.py
│
│
├── configs - this folder contains the experiment and model configs of your project.
│ └── simple_mnist_config.json
│
│
├── datasets - this folder might contain the datasets of your project.
│
│
└── utils - this folder contains any utils you need.
├── config.py - util functions for parsing the config files.
├── dirs.py - util functions for creating directories.
└── utils.py - util functions for parsing arguments.
You need to:
- Create a model class that inherits from BaseModel.
- Override the build_model function which defines your model.
- Call the build_model function from the constructor.
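For example, a minimal model class for the MNIST demo might look like the sketch below. It assumes BaseModel stores the config passed to its constructor and exposes a model attribute; the layer sizes and the optimizer are illustrative.

```python
# A minimal sketch, assuming BaseModel stores `config` and exposes `self.model`.
from keras.models import Sequential
from keras.layers import Dense, Flatten

from base.base_model import BaseModel


class SimpleMnistModel(BaseModel):
    def __init__(self, config):
        super(SimpleMnistModel, self).__init__(config)
        # Build the network as soon as the model object is created.
        self.build_model()

    def build_model(self):
        # A small fully connected classifier for 28x28 MNIST images.
        self.model = Sequential()
        self.model.add(Flatten(input_shape=(28, 28)))
        self.model.add(Dense(128, activation='relu'))
        self.model.add(Dense(10, activation='softmax'))
        self.model.compile(
            loss='sparse_categorical_crossentropy',
            optimizer='adam',  # this could also be read from the config
            metrics=['accuracy'],
        )
```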
You need to:
- Create a trainer class that inherits from BaseTrainer.
- Override the train function which defines the training logic.
Note: To add functionality after each training epoch, such as saving checkpoints or writing TensorBoard logs, use Keras callbacks:
- Declare a callbacks array in your constructor.
- Define an init_callbacks function to populate your callbacks array and call it in your constructor.
- Pass the callbacks array to the fit function on the model object.
Note: You can use fit_generator instead of fit to generate batches of data on the fly instead of loading the whole dataset at once.
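A trainer following these notes might look like the sketch below. It assumes the BaseTrainer class lives in base/base_train.py, that the trainer receives the compiled Keras model, the training data, and the config, and that the parsed config exposes its fields as attributes; the field names checkpoint_dir, tensorboard_log_dir, num_epochs, and batch_size are illustrative.

```python
# A minimal sketch; the config field names used here are illustrative.
import os

from keras.callbacks import ModelCheckpoint, TensorBoard

from base.base_train import BaseTrainer  # assumed to live in base/base_train.py


class SimpleMnistTrainer(BaseTrainer):
    def __init__(self, model, data, config):
        super(SimpleMnistTrainer, self).__init__(model, data, config)
        self.callbacks = []
        self.init_callbacks()

    def init_callbacks(self):
        # Save a checkpoint after every epoch (keeping only the best weights).
        self.callbacks.append(ModelCheckpoint(
            filepath=os.path.join(self.config.checkpoint_dir, 'weights-{epoch:02d}.hdf5'),
            save_best_only=True,
            verbose=1,
        ))
        # Write TensorBoard logs after every epoch.
        self.callbacks.append(TensorBoard(log_dir=self.config.tensorboard_log_dir))

    def train(self):
        x_train, y_train = self.data
        # `self.model` is assumed to be the compiled Keras model here.
        self.model.fit(
            x_train, y_train,
            epochs=self.config.num_epochs,
            batch_size=self.config.batch_size,
            validation_split=0.2,
            callbacks=self.callbacks,
        )
```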
You need to:
- Create a data loader class that inherits from BaseDataLoader.
- Override the get_train_data() and the get_test_data() functions to return your train and test dataset splits.
Note: You can also use a different design in which the data loader class exposes a get_next_batch function, if you want the data reader to fetch batches from your dataset on demand.
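A minimal data loader for the MNIST demo might look like the sketch below, assuming BaseDataLoader simply stores the config it receives.

```python
# A minimal sketch, assuming BaseDataLoader just stores the config.
from keras.datasets import mnist

from base.base_data_loader import BaseDataLoader


class SimpleMnistDataLoader(BaseDataLoader):
    def __init__(self, config):
        super(SimpleMnistDataLoader, self).__init__(config)
        (self.X_train, self.y_train), (self.X_test, self.y_test) = mnist.load_data()
        # Scale pixel values to [0, 1].
        self.X_train = self.X_train / 255.0
        self.X_test = self.X_test / 255.0

    def get_train_data(self):
        return self.X_train, self.y_train

    def get_test_data(self):
        return self.X_test, self.y_test
```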
You need to define a .json file that contains your experiment and model configurations such as the experiment name, the batch size, and the number of epochs.
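For illustration, a config could look something like the snippet below; the field names are examples only, so use whatever names your model, trainer, and data loader actually read.

```json
{
  "exp_name": "simple_mnist",
  "num_epochs": 10,
  "batch_size": 64,
  "learning_rate": 0.001
}
```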
The main script is responsible for building the pipeline:
- Parse the config file.
- Create an instance of your data loader class.
- Create an instance of your model class.
- Create an instance of your trainer class.
- Train your model by calling the train() function on the trainer object.
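Putting it all together, a sketch of main.py could look like the following. The helper names get_args and process_config are assumptions based on the utils folder described above, and the class names follow the MNIST example sketches.

```python
# A minimal sketch of main.py; helper and class names are illustrative.
from data_loader.simple_mnist_data_loader import SimpleMnistDataLoader
from model.simple_mnist_model import SimpleMnistModel
from trainer.simple_mnist_trainer import SimpleMnistTrainer
from utils.config import process_config  # assumed config-parsing helper
from utils.utils import get_args         # assumed argument-parsing helper


def main():
    # Parse the -c/--config argument and load the experiment config.
    args = get_args()
    config = process_config(args.config)

    # Create the data loader, the model, and the trainer.
    data_loader = SimpleMnistDataLoader(config)
    model = SimpleMnistModel(config)
    trainer = SimpleMnistTrainer(model.model, data_loader.get_train_data(), config)

    # Train the model.
    trainer.train()


if __name__ == '__main__':
    main()
```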
Create a command line tool for Keras project scaffolding where the user defines a data loader, a model, a trainer and runs the tool to generate the whole project.
Any contributions are welcome, including improvements to the template and the example projects.
This project template is based on MrGemy95's Tensorflow Project Template.
Thanks to my colleagues Mahmoud Khaled, Ahmed Waleed, and Ahmed El-Gammal, who worked on the initial project that spawned this template.