PyTorch Lightning Bolts

Pretrained SOTA Deep Learning models, callbacks and more for research and production with PyTorch Lightning and PyTorch

Continuous Integration

System / PyTorch ver.             1.4 (min. req.)   1.5 (latest)
Linux (py3.6 / py3.7 / py3.8)     CI testing        CI testing
OSX (py3.6 / py3.7 / py3.8)       CI testing        CI testing
Windows (py3.6 / py3.7 / py3.8)   wip               wip

Install

pip install pytorch-lightning-bolts

Docs

What is Bolts

Bolts is a deep learning research and production toolbox of the following (a quick sketch of how the pieces fit together appears after the list):

  • SOTA pretrained models.
  • Model components.
  • Callbacks.
  • Losses.
  • Datasets.
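
A minimal sketch of how these pieces compose, assuming the VAE model and CIFAR10DataModule shipped in your installed Bolts version (constructor arguments can differ between releases):

import pytorch_lightning as pl
from pl_bolts.models.autoencoders import VAE
from pl_bolts.datamodules import CIFAR10DataModule

# the datamodule handles downloading, splits and dataloaders
dm = CIFAR10DataModule(data_dir='.')

# a Bolts model is a plain LightningModule, so any Trainer flag works with it
model = VAE(input_height=32)

trainer = pl.Trainer(max_epochs=1)
trainer.fit(model, datamodule=dm)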

Main Goals of Bolts

The main goal of Bolts is to enable rapid iteration on model ideas.

Example 1: Finetuning on data

from torch.utils.data import DataLoader
from pl_bolts.models.self_supervised import SimCLR
from pl_bolts.models.self_supervised.simclr.transforms import SimCLRTrainDataTransform, SimCLREvalDataTransform
import pytorch_lightning as pl

# data (MyDataset is your own torch Dataset returning images)
train_data = DataLoader(MyDataset(transforms=SimCLRTrainDataTransform(input_height=32)))
val_data = DataLoader(MyDataset(transforms=SimCLREvalDataTransform(input_height=32)))

# model
model = SimCLR(pretrained='imagenet2012')

# train!
trainer = pl.Trainer(gpus=8)
trainer.fit(model, train_data, val_data)

Example 2: Subclass and ideate

from pl_bolts.models import ImageGPT
from pl_bolts.models.self_supervised import SimCLR

class VideoGPT(ImageGPT):

    def training_step(self, batch, batch_idx):
        x, y = batch
        x = _shape_input(x)  # reshape video frames into the format the GPT expects

        # self.gpt comes from ImageGPT; self.simclr is assumed to be set up in __init__
        logits = self.gpt(x)
        simclr_features = self.simclr(x)

        # -----------------
        # do something new with GPT logits + simclr_features
        # -----------------

        loss = self.criterion(logits.view(-1, logits.size(-1)), x.view(-1).long())

        logs = {"loss": loss}
        return {"loss": loss, "log": logs}

Who is Bolts for?

  • Corporate production teams
  • Professional researchers
  • Ph.D. students
  • Linear + Logistic regression heroes

I don't need deep learning

Great! We have LinearRegression and LogisticRegression implementations, along with bridges for numpy and sklearn datasets. But unlike the sklearn versions, our implementations run on multiple GPUs and TPUs and scale to much larger datasets.

Check out our Linear Regression on TPU demo

import pytorch_lightning as pl
from sklearn.datasets import load_boston

from pl_bolts.models.regression import LinearRegression
from pl_bolts.datamodules import SklearnDataModule

# sklearn dataset wrapped into train/val/test dataloaders
X, y = load_boston(return_X_y=True)
loaders = SklearnDataModule(X, y)

model = LinearRegression(input_dim=13)
trainer = pl.Trainer(num_tpu_cores=1)
trainer.fit(model, loaders.train_dataloader(), loaders.val_dataloader())
trainer.test(test_dataloaders=loaders.test_dataloader())
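
LogisticRegression follows the same pattern. A sketch assuming the MNISTDataModule that ships with Bolts (exact arguments may differ by version):

from pl_bolts.models.regression import LogisticRegression
from pl_bolts.datamodules import MNISTDataModule

# 28x28 MNIST images flattened into a 784-dim input
dm = MNISTDataModule('.')
model = LogisticRegression(input_dim=28 * 28, num_classes=10)

trainer = pl.Trainer(gpus=1)
trainer.fit(model, datamodule=dm)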

Is this another model zoo?

No!

Bolts is unique because models are implemented using PyTorch Lightning and structured so that they can be easily subclassed and iterated on.

For example, you can override the elbo loss of a VAE, or the generator_step of a GAN, to quickly try out a new idea. The best part is that all the models are benchmarked, so you won't waste time trying to "reproduce" results or chasing down bugs in your own implementation.
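
A sketch of that subclass-and-override pattern, using the Bolts GAN as the parent. The attribute names (self.discriminator, self.hparams.latent_dim) are assumptions about the GAN's internals, and the loss shown is just an illustrative replacement:

import torch
from pl_bolts.models.gans import GAN

class FancyGAN(GAN):
    """Same architecture as the Bolts GAN, with a custom generator objective."""

    def generator_step(self, x):
        # sample noise, generate images, then swap in your own generator loss
        z = torch.randn(x.size(0), self.hparams.latent_dim, device=self.device)
        fake = self(z)
        g_loss = -self.discriminator(fake).mean()  # hypothetical alternative loss
        return {"loss": g_loss, "log": {"g_loss": g_loss}}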

Team

Bolts is supported by the PyTorch Lightning team and the PyTorch Lightning community!