An interface to various automatic differentiation backends in Julia.
This package provides a backend-agnostic syntax to differentiate functions of the following types:

- allocating: `f(x) = y`
- mutating: `f!(y, x) = nothing`

It also offers:

- first- and second-order operators
- in-place and out-of-place differentiation
- a preparation mechanism (e.g. to create a config or tape)
- cross-backend testing and benchmarking utilities
- thorough validation on standard inputs and outputs (scalars, vectors, matrices)
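The preparation mechanism mentioned above lets expensive setup work (such as building a config or recording a tape) happen once and be reused across calls. Here is a minimal sketch; the `prepare_gradient` name and the exact argument order of the prepared `gradient` call are assumptions about the API, not guaranteed signatures.

```julia
# Sketch of the preparation mechanism (assumed names: prepare_gradient, gradient)
import ADTypes, ForwardDiff
using DifferentiationInterface

f(x) = sum(abs2, x)
backend = ADTypes.AutoForwardDiff()
x = [1.0, 2.0, 3.0]

# Prepare once (e.g. build a config or tape), then reuse for repeated calls
prep = prepare_gradient(f, backend, x)
grad = gradient(f, prep, backend, x)  # [2.0, 4.0, 6.0]
```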
We support most of the backends defined by ADTypes.jl:
| Backend | Object |
|:---|:---|
| ChainRulesCore.jl | `AutoChainRules(ruleconfig)` |
| Diffractor.jl | `AutoDiffractor()` |
| Enzyme.jl | `AutoEnzyme(Enzyme.Forward)` or `AutoEnzyme(Enzyme.Reverse)` |
| FiniteDiff.jl | `AutoFiniteDiff()` |
| FiniteDifferences.jl | `AutoFiniteDifferences(fdm)` |
| ForwardDiff.jl | `AutoForwardDiff()` |
| PolyesterForwardDiff.jl | `AutoPolyesterForwardDiff(; chunksize)` |
| ReverseDiff.jl | `AutoReverseDiff()` |
| Tracker.jl | `AutoTracker()` |
| Zygote.jl | `AutoZygote()` |
We also provide one additional backend:
| Backend | Object |
|:---|:---|
| FastDifferentiation.jl | `AutoFastDifferentiation()` |
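Because the backend is just an argument, switching between any two of the backends above only means swapping the object; the operator call itself is unchanged. A short sketch, assuming FiniteDiff is loaded alongside ForwardDiff:

```julia
# Backend-agnostic calls: the same value_and_gradient call works with
# any loaded backend; only the backend object changes.
import ADTypes, FiniteDiff, ForwardDiff
using DifferentiationInterface

f(x) = sum(abs2, x)
x = [1.0, 2.0, 3.0]

val_fwd, grad_fwd = value_and_gradient(f, ADTypes.AutoForwardDiff(), x)
val_fin, grad_fin = value_and_gradient(f, ADTypes.AutoFiniteDiff(), x)
# both return the value 14.0; the finite-difference gradient is only
# approximately equal to [2.0, 4.0, 6.0]
```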
```julia
julia> import ADTypes, ForwardDiff

julia> using DifferentiationInterface

julia> backend = ADTypes.AutoForwardDiff();

julia> f(x) = sum(abs2, x);

julia> value_and_gradient(f, backend, [1.0, 2.0, 3.0])
(14.0, [2.0, 4.0, 6.0])
```
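Mutating functions of the form `f!(y, x) = nothing` are handled by passing the output array explicitly. A sketch, assuming a `value_and_jacobian(f!, y, backend, x)` signature for the mutating case:

```julia
# Differentiating a mutating function f!(y, x) that writes into y
# (the four-argument value_and_jacobian signature is an assumption)
import ADTypes, ForwardDiff
using DifferentiationInterface

f!(y, x) = (y .= 2 .* x; nothing)  # fills y in place, returns nothing
backend = ADTypes.AutoForwardDiff()
x = [1.0, 2.0, 3.0]
y = zeros(3)

val, jac = value_and_jacobian(f!, y, backend, x)
# val is the filled output [2.0, 4.0, 6.0]; jac is the 3x3 Jacobian 2*I
```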
- AbstractDifferentiation.jl is the original inspiration for DifferentiationInterface.jl.
- AutoDiffOperators.jl is an attempt to bridge ADTypes.jl with AbstractDifferentiation.jl.