# Percival.jl - An augmented Lagrangian solver
Percival is an implementation of the augmented Lagrangian solver described in
S. Arreckx, A. Lambe, J. R. R. A. Martins, and D. Orban (2016). A Matrix-Free Augmented Lagrangian Algorithm with Application to Large-Scale Structural Design Optimization. Optimization and Engineering, 17, 359–384. doi:10.1007/s11081-015-9287-9
## How to Cite
If you use Percival.jl in your work, please cite using the format given in CITATION.bib.
## Installation

To install, press `]` to enter the `pkg>` mode of Julia, then run

```julia
pkg> add Percival
```
Consider the following 2-dimensional optimization problem with an equality constraint.
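The problem, reconstructed here from the code examples below (a Rosenbrock objective with a unit-circle equality constraint), can be stated as:

```latex
\min_{x \in \mathbb{R}^2} \quad (x_1 - 1)^2 + 100\,(x_2 - x_1^2)^2
\qquad \text{subject to} \qquad x_1^2 + x_2^2 = 1.
```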
You can solve a JuMP model by using NLPModelsJuMP.jl to convert it into an `NLPModel`.
```julia
using JuMP, NLPModelsJuMP, Percival
model = Model()
@variable(model, x[i = 1:2], start = [-1.2; 1.0][i])
@NLobjective(model, Min, (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2)
@NLconstraint(model, x[1]^2 + x[2]^2 == 1)
nlp = MathOptNLPModel(model)  # thin wrapper converting the JuMP model into an NLPModel
output = percival(nlp, verbose = 1)
```
`percival` accepts as input any instance of `AbstractNLPModel`; for instance, you can use automatic differentiation via ADNLPModels.jl to solve the same problem.
```julia
using ADNLPModels, Percival
nlp = ADNLPModel(
  x -> (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2,  # objective
  [-1.2; 1.0],                                  # starting point
  x -> [x[1]^2 + x[2]^2],                       # constraint function
  [1.0],                                        # constraint lower bound
  [1.0],                                        # constraint upper bound (equality)
)
output = percival(nlp, verbose = 1)
```
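Once the solver returns, you can inspect the result. The sketch below assumes `percival` follows the JuliaSmoothOptimizers convention of returning a SolverCore.jl `GenericExecutionStats` object; the field names shown are that convention's, not guaranteed by this README.

```julia
# Hedged sketch: assumes `output` is a SolverCore.GenericExecutionStats,
# the stats object conventionally returned by JuliaSmoothOptimizers solvers.
using ADNLPModels, Percival

nlp = ADNLPModel(
  x -> (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2,  # objective
  [-1.2; 1.0],                                  # starting point
  x -> [x[1]^2 + x[2]^2],                       # constraint function
  [1.0],                                        # constraint lower bound
  [1.0],                                        # constraint upper bound (equality)
)
output = percival(nlp)

println(output.status)     # solver status, e.g. :first_order on success
println(output.solution)   # approximate minimizer
println(output.objective)  # objective value at the solution
println(output.iter)       # number of iterations
```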
## Bug reports and discussions
If you think you found a bug, feel free to open an issue. Focused suggestions and requests can also be opened as issues. Before opening a pull request, please start an issue or a discussion on the topic.
If you want to ask a question not suited for a bug report, feel free to start a discussion here. This forum is for general discussion about this repository and the JuliaSmoothOptimizers organization, so questions about any of our packages are welcome.