If you use KnetNLPModels.jl in your work, please cite using the format given in CITATION.bib.
This package requires Julia ≥ 1.6.

This module can be installed and tested with the following commands:

`pkg> add KnetNLPModels`

`pkg> test KnetNLPModels`

KnetNLPModels is an interface between Knet.jl's classification neural networks and NLPModels.jl.
A KnetNLPModel gives the user access to:
- the values of the neural network variables/weights `w`;
- the value of the objective/loss function `L(X, Y; w)` at `w` for a given minibatch `(X, Y)`;
- the gradient `∇L(X, Y; w)` of the objective/loss function at `w` for a given minibatch `(X, Y)` (see the sketch below).
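For illustration, here is a minimal sketch of how these quantities can be evaluated through the standard NLPModels API (`obj` and `grad`). The `Dense` and `Chainnll` definitions follow the style of the Knet tutorials, the data are random placeholders, and the `KnetNLPModel` constructor keywords (`data_train`, `data_test`, `size_minibatch`) are assumptions to be checked against the tutorial.

```julia
using Knet, KnetNLPModels, NLPModels

# A dense layer and a chain with a negative log-likelihood loss,
# in the style of the Knet tutorials.
struct Dense; w; b; f; end
Dense(i::Int, o::Int, f = identity) = Dense(param(o, i), param0(o), f)
(d::Dense)(x) = d.f.(d.w * mat(x) .+ d.b)

struct Chainnll
  layers
  Chainnll(layers...) = new(layers)
end
(c::Chainnll)(x) = (for l in c.layers; x = l(x); end; x)
(c::Chainnll)(x, y) = Knet.nll(c(x), y)          # loss L(X, Y; w)

# Toy MNIST-shaped data; replace with a real dataset in practice.
xtrn = rand(Float32, 28, 28, 1, 600); ytrn = rand(1:10, 600)
xtst = rand(Float32, 28, 28, 1, 100); ytst = rand(1:10, 100)

model = Chainnll(Dense(784, 50, relu), Dense(50, 10))

# Assumed constructor keywords -- check the tutorial for the exact signature.
nlp = KnetNLPModel(model; data_train = (xtrn, ytrn),
                          data_test  = (xtst, ytst),
                          size_minibatch = 100)

w  = copy(nlp.meta.x0)   # flat vector of the network weights
fw = obj(nlp, w)         # L(X, Y; w) on the current minibatch
gw = grad(nlp, w)        # ∇L(X, Y; w) on the current minibatch
```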
In addition, it provides tools to:
- switch the minibatch used to evaluate the neural network;
- change the minibatch size;
- measure the neural network's accuracy at the current `w` (see the usage sketch below).
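The sketch below shows how these tools might be used. The function names (`rand_minibatch_train!`, `minibatch_next_train!`, `set_size_minibatch!`, `KnetNLPModels.accuracy`) are recalled from the package documentation and should be treated as assumptions; check the tutorial for the exact API.

```julia
# Hypothetical usage of the minibatch and accuracy utilities (names are assumptions).
rand_minibatch_train!(nlp)           # draw a new random training minibatch
minibatch_next_train!(nlp)           # move to the next training minibatch
set_size_minibatch!(nlp, 200)        # change the minibatch size
acc = KnetNLPModels.accuracy(nlp)    # accuracy of the network at the current weights w
```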
Check the tutorial.
If you think you found a bug, feel free to open an issue. Focused suggestions and requests can also be opened as issues. Before opening a pull request, please start an issue or a discussion on the topic.
If you want to ask a question not suited for a bug report, feel free to start a discussion here. This forum is for general discussion about this repository and the JuliaSmoothOptimizers organization, so questions about any of our packages are welcome.