
DiffOpt.jl

DiffOpt is a package for differentiating convex optimization programs with respect to the program parameters. It currently supports linear, quadratic, and conic programs. Refer to the documentation for examples. Built on JuMP.jl, DiffOpt can wrap many existing optimizers to create a differentiable optimization model.
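As a sketch of that wrapper pattern with a conic solver (assuming the SCS package is installed; any supported optimizer can be substituted):

using JuMP
import DiffOpt
import SCS

# Wrap the solver so the resulting JuMP model supports differentiation.
model = JuMP.Model(() -> DiffOpt.diff_optimizer(SCS.Optimizer))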

Installation

DiffOpt can be installed via the Julia package manager:

julia> ]
(v1.7) pkg> add DiffOpt
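The example below also uses JuMP and the HiGHS solver; if they are not already in your environment, they can be added the same way:

(v1.7) pkg> add JuMP HiGHS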

Example

  1. Create a model using the wrapper.
using JuMP
import DiffOpt
import HiGHS

model = JuMP.Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
  2. Define your model and solve it.
@variable(model, x)
@constraint(model, cons, x >= 3)
@objective(model, Min, 2x)

optimize!(model) # solve
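Before differentiating, it can help to confirm the solve succeeded; for this problem the optimum is x = 3:

termination_status(model)  # MOI.OPTIMAL
value(x)                   # 3.0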
  3. Choose the problem parameters to differentiate with respect to and set their perturbations.
MOI.set.(  # set perturbations / gradient inputs
    model,
    DiffOpt.BackwardInVariablePrimal(),
    x,
    1.0,
)
  4. Differentiate the model (through its primal and dual solutions) and fetch the gradients.
DiffOpt.backward(model) # differentiate

grad_exp = MOI.get(   # -3 x - 1
    model,
    DiffOpt.BackwardOutConstraint(),
    cons
)
JuMP.constant(grad_exp)  # -1
JuMP.coefficient(grad_exp, x)  # -3
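As a sanity check on these values: writing cons as a * x + b >= 0 with a = 1 and b = -3, the optimum is x* = -b/a, so dx*/da = b/a^2 = -3 and dx*/db = -1/a = -1, consistent with the coefficient and constant above. The same check can be done by finite differences (x_star below is a hypothetical helper for this sketch, not part of the DiffOpt API):

using JuMP
import HiGHS

# Hypothetical helper: re-solve min 2x s.t. a*x + b >= 0 and return the optimal x.
function x_star(a, b)
    m = Model(HiGHS.Optimizer)
    set_silent(m)
    @variable(m, x)
    @constraint(m, a * x + b >= 0)
    @objective(m, Min, 2x)
    optimize!(m)
    return value(x)
end

h = 1e-6
(x_star(1.0, -3.0 + h) - x_star(1.0, -3.0)) / h  # approx. -1.0, matches JuMP.constant(grad_exp)
(x_star(1.0 + h, -3.0) - x_star(1.0, -3.0)) / h  # approx. -3.0, matches JuMP.coefficient(grad_exp, x)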

Note