TensorHub

TensorHub is a library built on top of TensorFlow 2.0 to provide simple, modular and repeatable abstractions to accelerate deep learning research.




You have just found TensorHub!

TensorHub is a deep learning API written in Python, running on top of the machine learning platform TensorFlow 2 to provide simple, modular and repeatable abstractions to accelerate deep learning research. TensorHub is designed to be simple to understand, easy to write and quick to change.

Unlike many frameworks, TensorHub is extremely flexible about how you use its modules. Modules are designed to be self-contained and entirely decoupled from one another.

Use TensorHub if you need a deep learning library that offers:

  • Reproducibility - Reproduce the results of existing pre-trained models (such as ResNet, VGG, BERT, XLNet).

  • Modularity - A clear and robust interface allows users to combine modules with as few restrictions as possible.

  • Speed - Custom utilities and layers are built from the ground up on top of standard frameworks like TensorFlow and Keras, with efficiency in mind.

  • Prototyping - Code less, build more. Combine modular blocks to create fast prototypes with the help of pre-cooked models, custom layers and utility support.

  • Platform Independence - Run your model on CPU, on a single GPU, or across multiple devices with a distributed training strategy on top of TensorFlow 2 (see the sketch after this list).
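
As a minimal sketch of the platform-independence point, the snippet below trains a plain Keras model under tf.distribute.MirroredStrategy using only standard TensorFlow 2 calls; nothing here is TensorHub-specific API. A TensorHub model or layer stack built inside the same scope would be distributed the same way.

import tensorflow as tf

# MirroredStrategy uses all visible GPUs and falls back to a single
# device (CPU or one GPU) when none are available, so the same script
# runs unchanged across hardware setups.
strategy = tf.distribute.MirroredStrategy()

with strategy.scope():
    # Stand-in Keras model; any model built under this scope is
    # replicated across the strategy's devices.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(20,)),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])

# Dummy data just to make the example runnable end to end.
x = tf.random.normal((256, 20))
y = tf.cast(tf.random.uniform((256, 1)) > 0.5, tf.float32)
model.fit(x, y, batch_size=32, epochs=2)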

Installation & Compatibility

To use TensorHub, simply install it from PyPI via pip:

$ pip install tensorhub

TensorHub is compatible with:

  • Python 3.8+
  • TensorFlow 2.8.0
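
As a quick sanity check after installation (assuming the import name matches the PyPI package name), confirm the TensorFlow version and that the package imports cleanly:

$ python -c "import tensorflow as tf; print(tf.__version__)"
$ python -c "import tensorhub"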

Getting Started

The ideas behind deep learning are simple, so why should their implementation be painful?

TensorHub ships with a number of built-in modules, such as pre-built models and advanced layers, that are easy to use.

Models on a Plate (MoaP)

MoaPs are deep learning models that ship with TensorHub. These models can be used for training, feature extraction, fine-tuning, or however you wish.
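
The exact MoaP loading API is not shown in this README, so the sketch below uses tf.keras.applications.ResNet50 (one of the architectures listed above, available in stock Keras) to illustrate the intended workflow: use the pre-trained network as a frozen feature extractor, then optionally unfreeze it for fine-tuning. A TensorHub MoaP would slot into the same pattern.

import tensorflow as tf

# Stand-in for a Models-on-a-Plate model: a pre-trained backbone used as
# a frozen feature extractor, with a small task-specific head on top.
backbone = tf.keras.applications.ResNet50(
    include_top=False, weights="imagenet", pooling="avg")
backbone.trainable = False  # feature extraction: freeze pre-trained weights

model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.Dense(10, activation="softmax"),  # new classification head
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# For fine-tuning, unfreeze the backbone (or part of it) and recompile
# with a lower learning rate before continuing training.
backbone.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy")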

Layers

Layers are the basic building blocks of neural networks in TensorHub. A layer consists of a tensor-in tensor-out computation function (the layer's call method) and some state, held in TensorFlow variables (the layer's weights).

TensorHub provides custom layers conceptualized from proven, high-performing deep learning models. This lets you reuse the core building blocks of high-performing state-of-the-art (SOTA) models within a smaller or different neural architecture.
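
Because TensorHub layers follow the standard Keras layer contract described above (state held in TensorFlow variables, computation in the call method), a custom layer can be sketched as follows. The class below is illustrative only, not an actual TensorHub layer.

import tensorflow as tf

class ScaledDense(tf.keras.layers.Layer):
    """Illustrative layer: a dense projection with a learnable output scale."""

    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        # State: weights held in TensorFlow variables, created lazily
        # once the input shape is known.
        self.kernel = self.add_weight(
            name="kernel", shape=(input_shape[-1], self.units),
            initializer="glorot_uniform", trainable=True)
        self.scale = self.add_weight(
            name="scale", shape=(), initializer="ones", trainable=True)

    def call(self, inputs):
        # Computation: the tensor-in, tensor-out function.
        return self.scale * tf.matmul(inputs, self.kernel)

# Custom layers compose with built-in Keras layers like any other module.
outputs = ScaledDense(32)(tf.random.normal((4, 16)))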

Support

Post bug reports and feature requests (and only those) in GitHub Issues. Make sure to read our guidelines first.
