DifferentiationInterface.jl

An interface to various automatic differentiation (AD) backends in Julia.


Goal

This package provides a backend-agnostic syntax to differentiate functions of the following types:

  • one-argument functions (allocating): f(x) = y
  • two-argument functions (mutating): f!(y, x) = nothing
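As a minimal sketch, the two supported function shapes look like this (the function names and bodies below are illustrative, not part of the package):

```julia
# One-argument, allocating: returns its result y = f(x)
f(x) = sum(abs2, x)

# Two-argument, mutating: writes its result into y and returns nothing
function f!(y, x)
    y .= abs2.(x)
    return nothing
end

f([1.0, 2.0])            # 5.0
y = zeros(2)
f!(y, [1.0, 2.0])        # y is now [1.0, 4.0]
```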

Features

  • First- and second-order operators (gradients, Jacobians, Hessians and more)
  • In-place and out-of-place differentiation
  • Preparation mechanism (e.g. to create a config or tape)
  • Built-in sparsity handling
  • Thorough validation on standard inputs and outputs (numbers, vectors, matrices)
  • Testing and benchmarking utilities accessible to users with DifferentiationInterfaceTest

Compatibility

We support all of the backends defined by ADTypes.jl.

Note that in some cases, going through DifferentiationInterface.jl might be slower than a direct call to the backend's API. This is mostly true for Enzyme.jl, whose handling of activities and multiple arguments unlocks additional performance. We are working on this challenge, and welcome any suggestions or contributions. Meanwhile, if differentiation fails or takes too long, consider using Enzyme.jl directly.

Installation

To install the stable version of the package, run the following code in a Julia REPL:

using Pkg

Pkg.add("DifferentiationInterface")

To install the development version, run this instead:

using Pkg

Pkg.add(
    url="https://github.com/gdalle/DifferentiationInterface.jl",
    subdir="DifferentiationInterface"
)

Example

using DifferentiationInterface
import ForwardDiff, Enzyme, Zygote  # AD backends you want to use 

f(x) = sum(abs2, x)

x = [1.0, 2.0]

value_and_gradient(f, AutoForwardDiff(), x) # returns (5.0, [2.0, 4.0]) with ForwardDiff.jl
value_and_gradient(f, AutoEnzyme(),      x) # returns (5.0, [2.0, 4.0]) with Enzyme.jl
value_and_gradient(f, AutoZygote(),      x) # returns (5.0, [2.0, 4.0]) with Zygote.jl

To improve your performance by up to several orders of magnitude compared to this example, take a look at the DifferentiationInterface tutorial and its section on operator preparation.
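To illustrate, here is a rough sketch of the preparation workflow: an operator is prepared once for a typical input, and the resulting preparation object is reused across calls. This follows the pattern described in the DifferentiationInterface docs, though exact argument order may differ between package versions:

```julia
using DifferentiationInterface
import ForwardDiff  # AD backend used in this sketch

f(x) = sum(abs2, x)
backend = AutoForwardDiff()
x = [1.0, 2.0, 3.0]

# Prepare once: this may build a backend-specific config or tape
prep = prepare_gradient(f, backend, x)

# Reuse the preparation for every subsequent call with same-shaped input
grad = gradient(f, prep, backend, x)  # [2.0, 4.0, 6.0]
```

Reusing `prep` amortizes the setup cost, which is where the largest speedups over unprepared calls come from.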

Citation

Please cite both DifferentiationInterface.jl and its inspiration AbstractDifferentiation.jl, using the provided CITATION.bib file.