DifferentiationInterface


Package Docs

  • DifferentiationInterface: Stable / Dev
  • DifferentiationInterfaceTest: Stable / Dev

An interface to various automatic differentiation (AD) backends in Julia.

Goal

This package provides a unified syntax to differentiate functions.

Features

  • First- and second-order operators (gradients, Jacobians, Hessians and more)
  • In-place and out-of-place differentiation
  • Preparation mechanism (e.g. to create a config or tape)
  • Built-in sparsity handling
  • Thorough validation on standard inputs and outputs (numbers, vectors, matrices)
  • Testing and benchmarking utilities accessible to users with DifferentiationInterfaceTest
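To illustrate the first two bullet points, here is a minimal sketch of the out-of-place, in-place and second-order operators, assuming ForwardDiff.jl is installed and that the operator names match the current DifferentiationInterface API:

```julia
using DifferentiationInterface
import ForwardDiff  # backend used for this sketch

f(x) = sum(abs2, x)
x = [1.0, 2.0, 3.0]
backend = AutoForwardDiff()

# Out-of-place: allocates and returns the gradient
g = gradient(f, backend, x)  # [2.0, 4.0, 6.0]

# In-place: writes into a preallocated buffer to avoid allocations
grad = similar(x)
gradient!(f, grad, backend, x)

# Second-order operator from the same unified syntax
H = hessian(f, backend, x)  # 2I for this quadratic function
```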

Compatibility

We support all of the backends defined by ADTypes.jl, including ForwardDiff.jl, ReverseDiff.jl, FiniteDiff.jl, Enzyme.jl, Mooncake.jl and Zygote.jl.

Note that in some cases, going through DifferentiationInterface.jl might be slower than a direct call to the backend's API. This is mostly true for Enzyme.jl, whose handling of activities and multiple arguments unlocks additional performance. We are working on this challenge, and welcome any suggestions or contributions. Meanwhile, if differentiation fails or takes too long, consider using Enzyme.jl directly.

Installation

To install the stable version of the package, run the following code in a Julia REPL:

using Pkg

Pkg.add("DifferentiationInterface")

To install the development version, run this instead:

using Pkg

Pkg.add(
    url="https://github.com/gdalle/DifferentiationInterface.jl",
    subdir="DifferentiationInterface"
)

Example

using DifferentiationInterface
import ForwardDiff, Enzyme, Zygote  # AD backends you want to use 

f(x) = sum(abs2, x)

x = [1.0, 2.0]

value_and_gradient(f, AutoForwardDiff(), x) # returns (5.0, [2.0, 4.0]) with ForwardDiff.jl
value_and_gradient(f, AutoEnzyme(),      x) # returns (5.0, [2.0, 4.0]) with Enzyme.jl
value_and_gradient(f, AutoZygote(),      x) # returns (5.0, [2.0, 4.0]) with Zygote.jl

To improve your performance by up to several orders of magnitude compared to this example, take a look at the DifferentiationInterface tutorial and its section on operator preparation.
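The preparation mechanism can be sketched as follows; this assumes the `prepare_gradient` signature of recent DifferentiationInterface versions, where the preparation result is passed back to the operator:

```julia
using DifferentiationInterface
import ForwardDiff

f(x) = sum(abs2, x)
backend = AutoForwardDiff()
x = [1.0, 2.0]

# Prepare once for this function, backend and input shape
# (e.g. building a config or tape behind the scenes)
prep = prepare_gradient(f, backend, x)

# Reuse the preparation on every subsequent call with a same-shaped input
val, grad = value_and_gradient(f, prep, backend, x)  # (5.0, [2.0, 4.0])
```

Amortizing the preparation cost over many calls is where the largest speedups come from, especially for tape-based backends.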

Citation

Please cite both DifferentiationInterface.jl and its inspiration AbstractDifferentiation.jl, using the provided CITATION.bib file.