DiffOpt.jl

Differentiating convex optimization programs w.r.t. program parameters
Author: AKS1996
Started: May 2020


Differentiating convex optimization programs (JuMP.jl or MathOptInterface.jl models) with respect to program parameters. Currently supports LPs and QPs.
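For orientation, the parameter names used later in this README (`Q`, `q`, `h`) match the usual standard-form convention for a quadratic program. The notation below is this write-up's assumption about that convention, not text taken from the package docs:

```latex
\begin{aligned}
\min_{x}\quad & \tfrac{1}{2}\, x^\top Q x + q^\top x \\
\text{s.t.}\quad & G x \le h, \\
                 & A x = b .
\end{aligned}
```

Differentiating the program then means computing Jacobians such as $\partial x^\star / \partial Q$, $\partial x^\star / \partial q$, and $\partial x^\star / \partial h$ for the optimal primal solution $x^\star$ (and similarly for the dual variables). An LP is the special case $Q = 0$.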

Installation

DiffOpt can be installed through the Julia package manager:

(v1.3) pkg> add https://github.com/AKS1996/DiffOpt.jl

Usage

Create a differentiable model from an existing JuMP.jl or MathOptInterface.jl model:

    using DiffOpt
    
    ...
    
    diff = diff_model(model)

Solve the model with any of the existing optimizers:

    x, y = diff.forward()

Finally, differentiate the model (specifically, its primal and dual solutions) to obtain their Jacobians with respect to the problem data:

    grads = diff.backward(["Q", "q", "h"])
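Putting the three steps together, here is a minimal end-to-end sketch. The tiny LP, the choice of Clp as solver, and the variable names are illustrative assumptions made for this example; only `diff_model`, `forward`, and `backward` come from the usage shown above:

```julia
# Illustrative sketch: differentiate a tiny LP with DiffOpt.
# The model below and the Clp solver are assumptions for the example.
using JuMP, DiffOpt
using Clp  # any LP-capable MathOptInterface solver should work

model = Model(Clp.Optimizer)
@variable(model, x >= 0)
@variable(model, y >= 0)
@constraint(model, x + y <= 1)     # contributes a row of G and an entry of h
@objective(model, Min, -x - 2y)    # the q vector of the program (Q = 0 for an LP)

diff = diff_model(model)           # wrap the model for differentiation
x̂, ŷ = diff.forward()              # solve; returns primal and dual solutions
grads = diff.backward(["Q", "q", "h"])  # Jacobians w.r.t. the selected problem data
```

Since the problem is an LP, asking for the gradient with respect to `Q` is included here only to mirror the call shown above; for a pure LP the interesting parameters are `q`, `G`/`h`, and `A`/`b`.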

Note