CIAOAlgorithms.jl

Coordinate and Incremental Aggregated Optimization Algorithms

CIAOAlgorithms implements Block-Coordinate and Incremental Aggregated Optimization Algorithms for minimization problems of the form

$$\text{minimize } \frac{1}{N} \sum_{i=1}^N f_i(x) + g(x)$$

or

$$\text{minimize } \frac{1}{N} \sum_{i=1}^N f_i(x_i) + g\Big(\sum_{i=1}^N x_i\Big)$$

where each $f_i$ is smooth and $g$ is (possibly) nonsmooth with an easy-to-compute proximal mapping. These functions can be defined using the ProximalOperators.jl package.
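For example, the smooth terms and the nonsmooth term of a Lasso objective can be set up with ProximalOperators.jl as follows (a minimal sketch; the random data and the one-row-per-term splitting are purely illustrative):

```julia
using ProximalOperators

N, n = 20, 5                  # number of terms and problem dimension
A, b = randn(N, n), randn(N)  # illustrative random data

# smooth terms: f_i(x) = (1/2) * (a_i' * x - b_i)^2, one per data row
F = [LeastSquares(A[i:i, :], b[i:i]) for i in 1:N]

# nonsmooth term: g(x) = 0.1 * ||x||_1, whose prox is soft-thresholding
g = NormL1(0.1)

# the oracles the algorithms rely on: gradients of f_i and the prox of g
x = zeros(n)
grad_f1, f1x = gradient(F[1], x)             # gradient and value of f_1 at x
y, gy = prox(g, x .- 0.01 .* grad_f1, 0.01)  # one proximal step on g
```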

Quick guide

You can add CIAOAlgorithms from the Julia REPL by pressing `]` to enter the package manager, then running

```
pkg> add CIAOAlgorithms
```
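Equivalently, from a script or notebook:

```julia
using Pkg
Pkg.add("CIAOAlgorithms")
```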

Simple Lasso and logistic regression test examples can be found here.
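As a rough sketch of what such a Lasso run looks like, reusing `F` and `g` from the snippet above (the solver constructor and call signature below are assumptions for illustration; consult the linked examples for the exact API):

```julia
using CIAOAlgorithms

T = Float64
x0 = zeros(T, n)  # n as in the snippet above

# hypothetical call pattern: constructor and keywords may differ in practice
solver = CIAOAlgorithms.Finito{T}()
x_finito, it_finito = solver(F, g, x0, N = N)
```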

Implemented Algorithms

| Algorithm | Function | Reference |
| --- | --- | --- |
| Finito/MISO/DIAG | `Finito` | [1], [4], [8], [9] |
| ProShI | `Proshi` | [9] |
| SAGA | `SAGA` | [3], [6] |
| SAG | `SAG` | [7] |
| SVRG/SVRG++ | `SVRG` | [2], [4], [5] |
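Each entry in the Function column names a solver constructor; under the same assumed call pattern as in the sketch above, trying a different algorithm amounts to swapping the constructor:

```julia
# hypothetical, mirroring the Finito sketch above
solver = CIAOAlgorithms.SAGA{Float64}()
x_saga, it_saga = solver(F, g, x0, N = N)
```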

References

[1] Defazio, Domke, Finito: A faster, permutable incremental gradient method for big data problems, In International Conference on Machine Learning, pp. 1125–1133 (2014).

[2] Xiao, Zhang, A proximal stochastic gradient method with progressive variance reduction, SIAM Journal on Optimization, 24(4), 2057–2075 (2014).

[3] Defazio, Bach, Lacoste-Julien, SAGA: A fast incremental gradient method with support for non-strongly convex composite objectives, In Advances in Neural Information Processing Systems, pp. 1646–1654 (2014).

[4] Mairal, Incremental majorization-minimization optimization with application to large-scale machine learning, SIAM Journal on Optimization, 25(2), 829–855 (2015).

[5] Allen-Zhu, Yuan, Improved SVRG for non-strongly-convex or sum-of-non-convex objectives, In Proceedings of the 33rd International Conference on Machine Learning, pp. 1080–1089 (2016).

[6] Reddi, Sra, Poczos, Smola, Proximal stochastic methods for nonsmooth nonconvex finite-sum optimization, In Advances in Neural Information Processing Systems, pp. 1145–1153 (2016).

[7] Schmidt, Le Roux, Bach, Minimizing finite sums with the stochastic average gradient, Mathematical Programming, 162(1–2), 83–112 (2017).

[8] Mokhtari, Gürbüzbalaban, Ribeiro, Surpassing gradient descent provably: A cyclic incremental method with linear convergence rate, SIAM Journal on Optimization, 28(2), 1420–1447 (2018).

[9] Latafat, Themelis, Patrinos, Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems, arXiv:1906.10053 (2019).
