This package implements the optimization methods described in Wierstra et al., "Natural Evolution Strategies", JMLR (2014). The implementation follows the KISS™ principle; it can be used as:
```julia
function rosenbrock(x::AbstractVector{T}) where T
    s = (1.0 - x[1])^2
    for i in 1:(length(x) - 1)
        s += 100.0 * (x[i+1] - x[i]^2)^2
    end
    return s
end
```
```julia
optimize(rosenbrock, [0.3, 0.6], 1.0, sNES) # separable natural ES
# (sol = [0.9999902815083116, 0.9999805401026993], cost = 9.450201922031972e-11)

optimize(rosenbrock, [0.3, 0.6], 1.0, xNES) # exponential natural ES
# (sol = [0.9999999934969991, 0.9999999871800216], cost = 4.574949214506023e-17)
```
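For reference, this Rosenbrock variant attains its global minimum value 0 at (1, …, 1), which both solutions above approach. The objective is restated here (under the invented name `rosen`) so the check runs on its own:

```julia
# Same objective as `rosenbrock` above, restated so this snippet is self-contained.
rosen(x) = (1.0 - x[1])^2 + sum(100.0 * (x[i+1] - x[i]^2)^2 for i in 1:length(x)-1)

rosen([1.0, 1.0])   # the global minimum: exactly 0.0
rosen([0.3, 0.6])   # the cost at the initial condition used above
```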
For further information, type `?optimize` in the Julia REPL, which shows that `optimize` minimizes the function `f` according to:
- `f` : function to optimize, `μ::Vector -> cost::Real`
- `μ` : initial condition, `μ::Vector`
- `σ` : initial uncertainty on `μ`, `σ::{Real | Vector | Matrix}`
- `method` : `xNES` or `sNES`
  - `xNES` = exponential natural evolution strategies: expensive, but powerful on non-separable objectives
  - `sNES` = separable natural evolution strategies: lightweight, very powerful on separable or very high-dimensional objectives
- `options` :
  - `ημ` = learning rate for `μ`
  - `ησ` = learning rate for the uncertainties
  - `atol` = tolerance on the uncertainties (default `1e-8`)
  - `samples` = number of samples used to build the natural-gradient approximation
  - `iterations` = upper limit on the number of iterations (default `10^4`)
- Use `xNES` for hard problems with strongly correlated variables
- Use `sNES` for high-dimensional problems that exhibit many local minima
- Use `sNES` for problems with mostly separable variables
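To make the idea behind `sNES` concrete, here is a minimal, self-contained sketch of a separable NES loop: a diagonal-Gaussian search distribution with rank-based fitness shaping and a multiplicative update for the per-coordinate spread. This is an illustration only, not the package's implementation; the name `snes_sketch` and all defaults are invented for this example:

```julia
using Random

# Minimal separable-NES sketch: diagonal Gaussian search distribution,
# rank-based utilities, multiplicative update for the per-coordinate σ.
# Not the package's implementation; names and defaults are invented here.
function snes_sketch(f, μ0, σ0; samples=20, ημ=1.0, ησ=0.1,
                     iterations=2000, rng=Random.default_rng())
    d = length(μ0)
    μ = float.(μ0)
    σ = fill(float(σ0), d)
    # rank-based fitness shaping: the best sample gets the largest utility
    u = max.(0.0, log(samples / 2 + 1) .- log.(1:samples))
    u = u ./ sum(u) .- 1 / samples
    for _ in 1:iterations
        S = randn(rng, d, samples)          # standard-normal perturbations
        X = μ .+ σ .* S                     # candidate solutions
        order = sortperm([f(X[:, k]) for k in 1:samples])  # best first
        ∇μ = sum(u[i] .* S[:, order[i]] for i in 1:samples)
        ∇σ = sum(u[i] .* (S[:, order[i]] .^ 2 .- 1) for i in 1:samples)
        μ .+= ημ .* σ .* ∇μ                 # natural-gradient step for the mean
        σ .*= exp.(ησ / 2 .* ∇σ)            # multiplicative step for the spread
    end
    return (sol = μ, cost = f(μ))
end

res = snes_sketch(x -> sum(abs2, x), [2.0, -1.5], 1.0)
res.cost   # a small value near the minimum at the origin
```

Each iteration samples candidates around `μ`, ranks them by cost, and follows a natural-gradient estimate for both the mean and the per-coordinate standard deviations; because the covariance stays diagonal, the cost per iteration grows only linearly with the dimension.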
For a broader selection of optimization algorithms, look at the excellent BlackBoxOptim or Optim packages.