DistributedHyperOpt.jl is a package similar to HyperOpt.jl, but designed from the ground up for distributed (multi-processing) hyperparameter optimization.
1. Open a Julia REPL, switch to package mode using `]`, and activate your preferred environment.
2. Install DistributedHyperOpt.jl (a scripted equivalent is sketched after this list):

   (@v1.X) pkg> add "https://github.com/ThummeTo/DistributedHyperOpt.jl"

3. If you want to check that everything works correctly, you can run the tests bundled with DistributedHyperOpt.jl:

   (@v1.X) pkg> test DistributedHyperOpt

4. See the examples folder (or the testing scripts) for usage examples.
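If you prefer to install non-interactively (for example from a script or a CI job), the same steps can be expressed through Julia's Pkg API. This is a minimal sketch: the environment path is a placeholder, and running the tests is optional.

```julia
# Non-interactive equivalent of the REPL steps above.
using Pkg

# Activate your preferred environment (placeholder path, adjust as needed).
Pkg.activate("path/to/my/environment")

# Install DistributedHyperOpt.jl directly from the repository.
Pkg.add(url="https://github.com/ThummeTo/DistributedHyperOpt.jl")

# Optionally run the bundled tests.
Pkg.test("DistributedHyperOpt")
```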
The number of worker processes each sampler can exploit in parallel is limited as follows:

| Sampler | Max. processes |
|---|---|
| Random Sampler | unlimited |
| Hyperband (using Random Sampler) | num. brackets (s_max+1) |
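Worker processes are provided through Julia's standard Distributed library. Below is a minimal sketch of starting workers before using the package; the worker count of 4 is only an example and should respect the sampler limits above, and loading the package on all workers via `@everywhere` is the usual Distributed pattern rather than a documented requirement of DistributedHyperOpt.jl.

```julia
# Start worker processes so the hyperparameter optimization can be
# distributed across them. For Hyperband, the useful number of workers
# is bounded by the number of brackets (s_max + 1).
using Distributed
addprocs(4)

# Make the package available on the main process and all workers.
@everywhere using DistributedHyperOpt
```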