FluxTraining.jl

A flexible neural net training library inspired by fast.ai.

A Julia package for using and writing powerful, extensible training loops for deep learning models.

What does it do?

FluxTraining.jl gives you a customizable training loop built from composable callbacks, so common features like metrics tracking and hyperparameter scheduling come as reusable pieces instead of code you rewrite for every project.

When should you use FluxTraining.jl?

  • You don't want to implement your own metrics tracking, hyperparameter scheduling, or (insert common training feature here) for the 10th time
  • You want composable, reusable components that extend your training loop (see the callback sketch after this list)
  • You want a simple training loop with reasonable defaults that can grow with the needs of your project
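
For instance, training features are added by passing callbacks when constructing a Learner. A minimal sketch, assuming the keyword-argument form shown in the package docs (Metrics and accuracy are exported by FluxTraining; model and lossfn are placeholders for your own model and loss function):

using FluxTraining

# `model` and `lossfn` stand in for a Flux.jl model and a loss function.
# `Metrics(accuracy)` tracks accuracy during training and validation.
learner = Learner(model, lossfn; callbacks = [Metrics(accuracy)])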

How do you use it?

Install it like any other Julia package using the package manager:

]add FluxTraining
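
Equivalently, via the Pkg API:

using Pkg
Pkg.add("FluxTraining")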

After installation, import it and create a Learner from a Flux.jl model and a loss function (an optimizer and other options can be passed as keyword arguments). Then train with fit!, passing the number of epochs and your training and validation data iterators:

using FluxTraining

# `model`, `lossfn`, `trainiter`, and `validiter` are assumed to be defined:
# a Flux.jl model, a loss function, and train/validation data iterators.
learner = Learner(model, lossfn)
fit!(learner, 10, (trainiter, validiter))  # train for 10 epochs
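
For a self-contained run, here is a minimal end-to-end sketch; the toy model, synthetic data, and hyperparameters are illustrative choices, not part of FluxTraining.jl, and Flux.DataLoader assumes a recent Flux version:

using Flux, FluxTraining

# Synthetic regression data: 10 features per sample, batched with DataLoader.
xs, ys = rand(Float32, 10, 256), rand(Float32, 1, 256)
trainiter = Flux.DataLoader((xs, ys), batchsize = 32)
validiter = Flux.DataLoader((rand(Float32, 10, 64), rand(Float32, 1, 64)), batchsize = 32)

model = Chain(Dense(10, 32, relu), Dense(32, 1))  # a small Flux.jl model
lossfn = Flux.Losses.mse                          # mean squared error

learner = Learner(model, lossfn)
fit!(learner, 5, (trainiter, validiter))          # 5 epochs

Each epoch, fit! runs a training phase over trainiter and a validation phase over validiter, tracking the loss by default.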

Next, you may want to read the documentation.

Acknowledgements

The design of FluxTraining.jl's two-way callbacks is adapted from fastai's training loop.