ParameterJuMP.jl

A JuMP extension for using parameters in the constant terms of constraints.

Welcome to ParameterJuMP

ParameterJuMP adds new methods on top of JuMP for using constant parameters in optimization problems.

To construct a parameter, pass Param() as the variable-type argument to @variable:

@variable(model, p == 1, Param())
@variable(model, p[i = 1:3] == i, Param())
anon = @variable(model, variable_type = Param())

It is possible to change the current value of a parameter with the function:

set_value(p::ParameterRef, new_value::Number)

Query the current value of the parameter with:

value(p::ParameterRef)

Finally, the dual function of JuMP is overloaded to return duals for parameters:

dual(p::ParameterRef)
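
For example, a minimal sketch combining these functions (the dual of a parameter is only available after the model has been optimized, as in the simple example below):

using JuMP, ParameterJuMP

model = Model()

# a parameter starting at 1.5
@variable(model, p == 1.5, Param())

value(p)           # returns the current value, 1.5
set_value(p, 2.0)  # update the parameter
value(p)           # now returns 2.0

# dual(p) can be queried once the model has been optimized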

Last but not least, parameter algebra is implemented, making it possible to:

  • sum two parameters
  • multiply parameters by constants
  • sum parameters and variables
  • sum parameters and affine expressions

All the operations needed to build linear constraints are implemented, as sketched below.
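
A minimal sketch of this algebra (the names x, p and q below are only illustrative):

using JuMP, ParameterJuMP

model = Model()
@variable(model, x)
@variable(model, p == 1, Param())
@variable(model, q == 2, Param())

p + q           # sum of two parameters
3 * p           # parameter multiplied by a constant
x + p           # sum of a variable and a parameter
2x + 1 + p + q  # sum of an affine expression and parameters

# such expressions can be used when building linear constraints
@constraint(model, 2x + 1 >= p + 3q)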

Simple example

Let's use JuMP together with ParameterJuMP to solve the optimization problem:

min   x
s.t.  x >= a

where x is a variable and a is a constant. We also want to solve it for different values of a.

# load the packages (SOME_SOLVER stands for an actual solver package)
using JuMP, ParameterJuMP

# Create a JuMP model able to handle parameters
model = Model(SOME_SOLVER.Optimizer)

# Create a regular JuMP variable
@variable(model, x)

# Create a parameter fixed at 10
@variable(model, a == 10, Param())

# adds a constraint mixing variables and parameters to the model
@constraint(model, x >= a)

# solve the model
optimize!(model)

# query the dual of the parameter a
dual(a)

# modify the value of the parameter a to 20
set_value(a, 20)

# solve the model with the new value of the parameter
optimize!(model)

Installation

Currently ParameterJuMP works with Julia 1.x and JuMP 0.21.x

import Pkg; Pkg.add("ParameterJuMP")

Motivation

Suppose we have a linear programming problem of the following form:

min   c'x
s.t.  Ax == b - Dy
      x >= 0

The only decision variable in the problem is x. On the other hand, y is a mere parameter.

Problems like this appear frequently in stochastic optimization and in decomposition frameworks.

In stochastic optimization it is common to solve the same problem for multiple values of y, which are typically scenario dependent.

In decomposition frameworks, it is useful to solve the same problem for multiple values of y, but it is even more important to be able to query dual values for y. These dual values are computed by applying the chain rule to the duals of the constraints in which y appears.
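
As a sketch of that chain rule for the problem above (up to the solver's sign convention for duals), let lambda_k be the dual of the k-th constraint, whose constant term is b[k] - sum(D[j,k]*y[j] for j in 1:M). Then

dual(y_j) = sum_k lambda_k * d/dy_j (b[k] - sum_j D[j,k]*y[j])
          = - sum_k D[j,k] * lambda_k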

Pure JuMP version

In pure JuMP we can accomplish these tasks by creating dummy fixed variables, so that we can easily change their fixed values and query duals from the fixing constraints.

One example in pure JuMP goes as follows:

# load JuMP (SOME_SOLVER stands for an actual solver package)
using JuMP

# create a regular JuMP Model
model_pure = Model(SOME_SOLVER.Optimizer)

# add optimization variables
@variable(model_pure, x[1:N] >= 0)

# add dummy fixed variables and constraints fixing y to them
@variable(model_pure, y[1:M])
@variable(model_pure, y_fixed[i = 1:M] == value_for_y[i])
@constraint(model_pure, fix_y[j in 1:M], y[j] == y_fixed[j])

# add constraints
@constraint(model_pure, ctr[k in 1:P],
    sum(A[i,k]*x[i] for i in 1:N) == b[k] - sum(D[j,k]*y[j] for j in 1:M))

# create objective function
@objective(model_pure, Min, sum(c[i]*x[i] for i in 1:N))

# solve problem
optimize!(model_pure)

# query dual values
y_duals = dual.(fix_y)

# modify the fixed values of y (fixed variables are updated with fix)
fix.(y_fixed, new_value_for_y)

# solve problem (again)
optimize!(model_pure)

# query dual values (again)
y_duals = dual.(fix_y)

The main problem with this approach is that it adds many dummy variables and constraints to the solver's representation of the optimization problem without any real need, which increases solve times unnecessarily.

ParameterJuMP version

The same example from the motivation can be written with parameters:

# load the packages (SOME_SOLVER stands for an actual solver package)
using JuMP, ParameterJuMP

# create a ParameterJuMP model
model_param = Model(SOME_SOLVER.Optimizer)

# add optimization variables
@variable(model_param, x[1:N] >= 0)

# add parameters fixed at the initial values
@variable(model_param, y[i = 1:M] == value_for_y[i], Param())

# add constraints
@constraint(model_param, ctr[k in 1:P],
    sum(A[i,k]*x[i] for i in 1:N) == b[k] - sum(D[j,k]*y[j] for j in 1:M))

# create objective function
@objective(model_param, Min, sum(c[i]*x[i] for i in 1:N))

# solve problem
optimize!(model_param)

# query dual values
y_duals = dual.(y)

# modify y
set_value.(y, new_value_for_y)

# solve problem (again)
optimize!(model_param)

# query dual values (again)
y_duals = dual.(y)

Acknowledgments

ParameterJuMP was developed by:

  • Joaquim Dias Garcia (@joaquimg), PSR and PUC-Rio
  • Benoît Legat (@blegat), UCLouvain