A Julia package for using and writing powerful, extensible training loops for deep learning models.
What does it do?
- Implements a training loop to take the boilerplate out of training deep learning models
- Lets you add features to training loops through reusable callbacks
- Comes with callbacks for many common use cases like hyperparameter scheduling, metrics tracking and logging, checkpointing, early stopping, and more (see the sketch after this list)
- Is extensible by creating custom, reusable callbacks or even custom training loops
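As a taste of what this looks like, the following sketch attaches two built-in callbacks to a `Learner`. The constructor names (`Metrics`, `Checkpointer`) and the keyword-based `Learner` signature reflect one reading of the package docs; check the callback reference for the exact API.

```julia
using Flux, FluxTraining

# Toy model and loss, purely for illustration.
model = Chain(Dense(128 => 64, relu), Dense(64 => 10))
lossfn = Flux.Losses.logitcrossentropy

# ASSUMPTION: `Metrics(accuracy)` and `Checkpointer(folder)` are built-in
# callbacks; verify names and signatures against the callback reference.
learner = Learner(model, lossfn; callbacks = [
    Metrics(accuracy),            # track accuracy in addition to the loss
    Checkpointer("checkpoints/"), # save the model after every epoch
])
```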
When should you use FluxTraining.jl?
- You don't want to implement your own metrics tracking, hyperparameter scheduling, or *insert common training feature here* for the 10th time
- You want to use composable and reusable components that enhance your training loop
- You want a simple training loop with reasonable defaults that can grow with the needs of your project
How do you use it?
Install like any other Julia package using the package manager:
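```julia
using Pkg
Pkg.add("FluxTraining")
```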
Then create a `Learner` from a Flux.jl model and a loss function, and start training with `fit!`:

```julia
using FluxTraining

# `model`, `lossfn` and the data iterators `trainiter`/`validiter`
# are defined by you.
learner = Learner(model, lossfn)
fit!(learner, 10, (trainiter, validiter))
```

This trains for 10 epochs, running a training phase over `trainiter` and a validation phase over `validiter` in each.
Next, you may want to read:
- Getting started
- A full example training an image classifier on the MNIST dataset
- The documentation of FastAI.jl which features many end-to-end examples
The design of FluxTraining.jl's two-way callbacks is adapted from fastai's training loop.
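"Two-way" means callbacks hook into events of the training loop and can read and modify its state, not just observe it. Below is a minimal sketch of a custom callback, assuming the `Callback` supertype and the `on(event, phase, callback, learner)` extension point as I understand them; consult the custom-callback docs for the authoritative interface.

```julia
import FluxTraining

# ASSUMPTION: the type names `Callback`, `EpochEnd` and
# `AbstractTrainingPhase` follow one reading of FluxTraining.jl's
# callback API; check the custom-callback docs for the exact names.
struct EpochPrinter <: FluxTraining.Callback end

# Runs at the end of every training epoch; it touches no learner
# state, to keep the example minimal.
function FluxTraining.on(::FluxTraining.EpochEnd,
                         ::FluxTraining.AbstractTrainingPhase,
                         ::EpochPrinter,
                         learner)
    println("Finished a training epoch")
end

# Used like any other callback:
# learner = Learner(model, lossfn; callbacks = [EpochPrinter()])
```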