# DiffOpt.jl

Differentiate convex optimization programs (`JuMP.jl` or `MathOptInterface.jl` models) with respect to the program parameters. Currently supports LPs and QPs.

## Installation

DiffOpt can be installed through the Julia package manager:

```
(v1.3) pkg> add https://github.com/jump-dev/DiffOpt.jl
```

## Usage

Create a differentiable model from existing optimizers:

```
using DiffOpt
using GLPK
using MathOptInterface
const MOI = MathOptInterface

diff = diff_optimizer(GLPK.Optimizer)
```
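
The returned `diff` behaves like a regular `MathOptInterface` optimizer, so the model can be built and solved with the usual `MOI` calls, as shown below.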

Update and solve the model:

```
x = MOI.add_variables(diff, 2)

# ... set an objective and add constraints here ...

MOI.optimize!(diff)
```
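
For a more complete sketch, continuing from the snippets above (with `diff` and `x` already defined): the problem data below are arbitrary, chosen only for illustration, and the calls assume the `MathOptInterface` API of this package's era (e.g. `MOI.SingleVariable` for variable bounds):

```
# illustrative LP: minimize x[1] + x[2]  subject to  x[1] + x[2] >= 1, x >= 0

# constraint: x[1] + x[2] >= 1
MOI.add_constraint(
    diff,
    MOI.ScalarAffineFunction(MOI.ScalarAffineTerm.(1.0, x), 0.0),
    MOI.GreaterThan(1.0),
)

# bounds: x >= 0, one constraint per variable
for xi in x
    MOI.add_constraint(diff, MOI.SingleVariable(xi), MOI.GreaterThan(0.0))
end

# objective: minimize x[1] + x[2]
MOI.set(
    diff,
    MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}}(),
    MOI.ScalarAffineFunction(MOI.ScalarAffineTerm.(1.0, x), 0.0),
)
MOI.set(diff, MOI.ObjectiveSense(), MOI.MIN_SENSE)

MOI.optimize!(diff)
```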

Finally, differentiate the model (specifically, its optimal primal and dual solutions) to obtain the product of the Jacobians with respect to the problem parameters and a backward-pass vector.

Currently, DiffOpt supports two backends. First, if the optimization problem is of the quadratic form

```
minimize_z  z^T Q z / 2 + q^T z
subject to: Az = b,
            Gz ≤ h
```

then one can compute gradients by providing a backward-pass vector:

```
bpv = [1.0, 1.0]
grads = backward(diff, ["Q", "q", "h"], bpv)
```
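
Here `bpv` is the backward-pass vector described above: conceptually, each entry of `grads` is the product of `bpv` with the Jacobian of the optimal solution with respect to one of the requested parameters (`Q`, `q`, `h`).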

Second, for a conic problem of the form

```
minimize_x  c^T x
subject to: Ax + b ∈ K
```

where

- the objective is linear, and
- `K` is a Cartesian product of linear, semidefinite, and second-order cones,

one can compute gradients by providing perturbations `dA`, `db`, and `dc`:

```
grads = backward(diff, dA, db, dc)
```
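
For example, to differentiate with respect to the objective vector alone, one might pass zero perturbations for the constraint data. This is a minimal sketch; the dimensions `m` and `n` are hypothetical and must match the shapes of `A`, `b`, and `c` in the actual model:

```
m, n = 3, 2        # hypothetical problem dimensions: A is m×n

dA = zeros(m, n)   # no perturbation of the constraint matrix
db = zeros(m)      # no perturbation of the constraint offsets
dc = ones(n)       # perturb every objective coefficient

grads = backward(diff, dA, db, dc)
```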