A pure Julia machine learning framework.
MLJ aims to be a flexible framework for combining and tuning machine learning models, written in Julia, a high-performance, rapid-development scientific programming language.
MLJ is in a relatively early stage of development and welcomes new collaborators. Click here if you are interested in contributing, or in implementing the MLJ interface for an existing Julia machine learning algorithm.
The MLJ project is partly inspired by MLR.
A list of models implementing the MLJ interface: MLJRegistry
In the Julia REPL:

```julia
]add https://github.com/wildart/TOML.jl
add https://github.com/alan-turing-institute/MLJBase.jl
add https://github.com/alan-turing-institute/MLJModels.jl
add https://github.com/alan-turing-institute/MLJ.jl
```
A docker image with installation instructions is also available.
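Once installed, basic use follows a machine/fit/predict pattern. The sketch below assumes the `machine`, `fit!`, and `predict` functions of the MLJ interface; `KNNRegressor` and its `K` hyperparameter are assumed to be available from MLJModels, and exact names may differ in this early version:

```julia
using MLJ

# Toy tabular data (any Tables.jl-compatible source works for X).
X = (x1 = rand(100), x2 = rand(100))
y = 2 .* X.x1 .+ 0.1 .* rand(100)

# Bind a model and data together in a machine, then fit and predict.
knn = KNNRegressor(K=5)     # assumed model name; any MLJ regressor works here
mach = machine(knn, X, y)
fit!(mach)
yhat = predict(mach, X)
```

The machine wraps the model together with its training data and learned parameters, so the same pattern applies unchanged to composite and tuned models.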
- Automated tuning of hyperparameters, including composite models with nested parameters. Tuning is implemented as a wrapper, allowing composition with other meta-algorithms. ✔
- Option to tune hyperparameters using gradient descent and automatic differentiation (for learning algorithms written in Julia).
- Data agnostic: train models on any data supported by the Tables.jl interface. ✔
- Intuitive syntax for building arbitrarily complicated learning networks. ✔
- Learning networks can be exported as self-contained composite models ✔, and common networks (e.g., linear pipelines, stacks) come ready to plug and play.
- Performant parallel implementation of large homogeneous ensembles of arbitrary models (e.g., random forests). ✔
- Task interface matching machine learning problems to available models. ✔ (mostly)
- Benchmarking of a battery of assorted models for a given task.
- Automated estimates of CPU and memory requirements for a given task/model.
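The tuning-as-a-wrapper design above can be sketched as follows. This is a minimal sketch assuming `TunedModel`, `Grid`, `CV`, and `range` as in the MLJ interface; `KNNRegressor` and the exact keyword names are assumptions and may differ in this early version:

```julia
using MLJ

# Toy data (any Tables.jl-compatible table works for X).
X = (x1 = rand(100), x2 = rand(100))
y = 2 .* X.x1 .+ 0.1 .* rand(100)

# Assumed base model implementing the MLJ interface.
knn = KNNRegressor()

# Wrap the model in a grid-search tuning strategy over its K hyperparameter.
# The wrapper is itself just another model, so it composes freely with
# resampling, ensembling, and other meta-algorithms.
r = range(knn, :K, lower=1, upper=20)
tuned_knn = TunedModel(model=knn, tuning=Grid(), resampling=CV(), ranges=r)

# Fitting the wrapped model tunes K internally via cross-validation.
mach = machine(tuned_knn, X, y)
fit!(mach)
```

Because nested hyperparameters of composite models are addressable in the same way, the same wrapper tunes a whole learning network, not just an atomic model.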
See here.
- The ScikitLearn SVM models do not work under Julia 1.0.3 but do work under Julia 1.1, due to Issue #29208.
Get started here, or take the MLJ tour.
Predecessors of the current package are AnalyticalEngine.jl, Orchestra.jl, and Koala.jl. Work continued as a research study group at the University of Warwick, beginning with a review of existing ML modules available in Julia at the time (in-depth, overview). Further work culminated in the first MLJ proof-of-concept.