hb-base

Project structure of Deep Learning experiments

Project Introduction

hb-base: project structure of Deep Learning experiments

hb-base proposes a structure for deep learning projects. It provides a basic skeleton built on TensorFlow's high-level APIs, so if you start a deep learning project from this template, all you need to do is implement the core.

Why?

  • There is a lot of boilerplate code to write when starting a new deep learning project.
  • Encourages the use of the high-level APIs (Estimator, Experiment, Dataset and tf.metrics); see the sketch after this list.
  • You can focus on the core (the model's graph).
  • Training and evaluation results are automatically logged to TensorBoard.
  • When training terminates, you can receive a notification via Slack.
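
As a rough illustration of how these pieces fit together, here is a minimal model_fn sketch (not hb-base's actual model.py; the names and the toy dense layer are assumptions) that returns a tf.estimator.EstimatorSpec and declares tf.metrics, which TensorBoard picks up automatically:

import tensorflow as tf

def model_fn(features, labels, mode, params):
    # "Core" graph: from input to logits (the part you implement).
    logits = tf.layers.dense(features["x"], units=params["num_classes"])
    predictions = tf.argmax(logits, axis=-1)

    if mode == tf.estimator.ModeKeys.PREDICT:
        return tf.estimator.EstimatorSpec(mode, predictions=predictions)

    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    # tf.metrics declared here show up in TensorBoard during evaluation.
    eval_metric_ops = {"accuracy": tf.metrics.accuracy(labels, predictions)}

    train_op = tf.train.AdamOptimizer().minimize(
        loss, global_step=tf.train.get_global_step())
    return tf.estimator.EstimatorSpec(
        mode, loss=loss, train_op=train_op, eval_metric_ops=eval_metric_ops)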

Requirements

Project Structure

.
├── config/                 # Config files (.yml, .json) used with hb-config
├── data/                   # Dataset path
├── notebooks/              # Prototyping with numpy or tf.InteractiveSession
├── scripts/                # Download datasets using shell scripts
├── concrete_model/         # Concrete model architecture graphs (from input to logits)
│   ├── __init__.py         # Graph logic
│   └── ...                 # Implements the components or modules
├── data_loader.py          # data_reader, preprocessing, make_batch
├── hook.py                 # Training or test hooks (e.g. print_variables, handle training config)
├── main.py                 # Defines experiment_fn (enables tfdbg)
├── model.py                # Defines EstimatorSpec
└── utils.py                # Slack notification (incoming webhook)

References: hb-config, Dataset, experiment_fn, EstimatorSpec, tfdbg
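
To make the concrete_model/ entry more tangible, here is a hypothetical sketch of the "Graph logic" (the class name, method, and layer sizes are illustrative assumptions, not hb-base's actual code):

import tensorflow as tf

class Graph:
    def __init__(self, mode):
        self.mode = mode  # one of the tf.estimator.ModeKeys values

    def build(self, inputs):
        # From input to logits: the only part you need to implement.
        is_training = self.mode == tf.estimator.ModeKeys.TRAIN
        net = tf.layers.dense(inputs, 128, activation=tf.nn.relu)
        net = tf.layers.dropout(net, rate=0.2, training=is_training)
        logits = tf.layers.dense(net, 10)  # e.g. 10 output classes
        return logits

In this sketch, model.py would call something like Graph(mode).build(features) to get the logits it wraps in an EstimatorSpec.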

The directories below contain dummy data.

  • config/
  • data/
  • notebooks/
  • scripts/

Experiment modes

  • evaluate : Evaluates on the evaluation data.
  • extend_train_hooks : Extends the hooks for training.
  • reset_export_strategies : Resets the export strategies with the new_export_strategies.
  • run_std_server : Starts a TensorFlow server and joins the serving thread.
  • test : Tests training, evaluating and exporting the estimator for a single step.
  • train : Fits the estimator using the training data.
  • train_and_evaluate : Interleaves training and evaluation (see the dispatch sketch after this list).
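
These modes correspond to methods on tf.contrib.learn.Experiment. As a hedged sketch of how main.py might hand the --mode flag to learn_runner as a schedule (the experiment_fn body and the input functions are assumptions borrowed from the sketches above):

import tensorflow as tf
from tensorflow.contrib.learn import learn_runner

def experiment_fn(run_config, hparams):
    # model_fn and make_batch are assumed to come from model.py and
    # data_loader.py (see the earlier sketches); the names are illustrative.
    estimator = tf.estimator.Estimator(model_fn=model_fn, config=run_config)
    return tf.contrib.learn.Experiment(
        estimator,
        train_input_fn=lambda: make_batch("train"),
        eval_input_fn=lambda: make_batch("eval"))

learn_runner.run(
    experiment_fn=experiment_fn,
    run_config=tf.contrib.learn.RunConfig(model_dir="logs"),
    schedule="train_and_evaluate",  # any of the modes listed above
    hparams=None)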

Usage Example

Download this repository as a zip, then implement the concrete model's graph and data_loader, and customize the other files as needed.
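
For data_loader.py, here is a minimal sketch of the make_batch responsibility using tf.data (the function name, the fake data, and the feature layout are assumptions for illustration):

import numpy as np
import tensorflow as tf

def make_batch(split, batch_size=32):
    # data_reader + preprocessing would normally load and clean real data;
    # a tiny random dataset keeps this sketch self-contained.
    x = np.random.rand(100, 16).astype(np.float32)
    y = np.random.randint(0, 10, size=100).astype(np.int32)

    dataset = tf.data.Dataset.from_tensor_slices(({"x": x}, y))
    if split == "train":
        dataset = dataset.shuffle(buffer_size=100).repeat()
    dataset = dataset.batch(batch_size)
    features, labels = dataset.make_one_shot_iterator().get_next()
    return features, labels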

After implementing, install the requirements:

pip install -r requirements.txt

Then download the dataset and run preprocessing:

sh scripts/download_dataset.sh
python data_loader.py --config check-tiny

Finally, train and evaluate the model:

python main.py --config check-tiny --mode train_and_evaluate
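
When the run terminates, utils.py can post a Slack notification through an incoming webhook, as mentioned above. A minimal sketch (the function name is illustrative, requests is an assumed dependency, and the webhook URL would come from your own Slack configuration):

import json
import requests

def send_message_to_slack(text, webhook_url):
    # Slack incoming webhooks accept a JSON payload with a "text" field.
    requests.post(webhook_url, data=json.dumps({"text": text}),
                  headers={"Content-Type": "application/json"})

# e.g. after training finishes (hypothetical webhook URL):
# send_message_to_slack("check-tiny: train_and_evaluate finished",
#                       webhook_url="https://hooks.slack.com/services/...")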

TensorBoard

tensorboard --logdir logs


Author

Maintainer: Dongjun Lee (humanbrain.djlee@gmail.com)