Machine Learning Packages
- MLJ.jl (1779 stars): A Julia machine learning framework (see the MLJ usage sketch after this list)
- Knet.jl (1427 stars): Koç University deep learning framework
- BrainFlow.jl (1273 stars): BrainFlow is a library intended to obtain, parse and analyze EEG, EMG, ECG and other kinds of data from biosensors
- TensorFlow.jl (884 stars): A Julia wrapper for TensorFlow
- DiffEqFlux.jl (861 stars): Pre-built implicit layer architectures with O(1) backprop, GPUs, and stiff+non-stiff DE solvers, demonstrating scientific machine learning (SciML) and physics-informed machine learning methods
- FastAI.jl (589 stars): Repository of best practices for deep learning in Julia, inspired by fastai
- ScikitLearn.jl (546 stars): Julia implementation of the scikit-learn API (https://cstjean.github.io/ScikitLearn.jl/dev/)
- Lux.jl (479 stars): Elegant & Performant Scientific Machine Learning in Julia (see the Lux sketch after this list)
- Enzyme.jl (438 stars): Julia bindings for the Enzyme automatic differentiator
- StatisticalRethinking.jl (386 stars): Julia package with selected functions from the R package `rethinking`; used in the SR2... projects
- MXNet.jl (371 stars): MXNet Julia package, flexible and efficient deep learning in Julia
- AutoMLPipeline.jl (355 stars): A package that makes it trivial to create and evaluate machine learning pipeline architectures
- Clustering.jl (353 stars): A Julia package for data clustering (see the k-means sketch after this list)
- DecisionTree.jl (351 stars): Julia implementation of Decision Tree (CART) and Random Forest algorithms
- Metalhead.jl (328 stars): Computer vision models for Flux
- Dojo.jl (307 stars): A differentiable physics engine for robotics
- SimpleChains.jl (234 stars): Simple chains (small, CPU-optimized neural networks)
- MLDatasets.jl (227 stars): Utility package for accessing common machine learning datasets in Julia
- GraphNeuralNetworks.jl (218 stars): Graph neural networks in Julia
- AbstractGPs.jl (217 stars): Abstract types and methods for Gaussian processes
- Torch.jl (211 stars): Sensible extensions for exposing torch in Julia
- ReservoirComputing.jl (206 stars): Reservoir computing utilities for scientific machine learning (SciML)
- MLBase.jl (186 stars): A set of functions to support the development of machine learning algorithms
- EvoTrees.jl (175 stars): Boosted trees in Julia
- MLJBase.jl (160 stars): Core functionality for the MLJ machine learning framework
- Yota.jl (158 stars): Reverse-mode automatic differentiation in Julia
- ForneyLab.jl (149 stars): Julia package for automatically generating Bayesian inference algorithms through message passing on Forney-style factor graphs
- LossFunctions.jl (147 stars): Julia package of loss functions for machine learning
- MLJFlux.jl (145 stars): Wrapping deep learning models from the package Flux.jl for use in the MLJ.jl toolbox
- Merlin.jl (144 stars): Deep learning for Julia
- AugmentedGaussianProcesses.jl (135 stars): Gaussian process package based on data augmentation, sparsity and natural gradients
- ConformalPrediction.jl (135 stars): Predictive uncertainty quantification through conformal prediction for machine learning models trained in MLJ
- FluxTraining.jl (119 stars): A flexible neural net training library inspired by fast.ai
- CounterfactualExplanations.jl (117 stars): A package for counterfactual explanations and algorithmic recourse in Julia
- MachineLearning.jl (116 stars): Julia machine learning library
- MIPVerify.jl (113 stars): Evaluating robustness of neural networks with mixed integer programming
- TSML.jl (112 stars): A package for time series data processing, classification, clustering, and prediction
- MLUtils.jl (107 stars): Utilities and abstractions for machine learning tasks
- Avalon.jl (106 stars): Starter kit for legendary models
- ExplainableAI.jl (106 stars): Explainable AI in Julia
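As a quick orientation, here is a minimal sketch of how MLJ.jl is typically driven, binding the DecisionTree.jl classifier listed above to MLJ's built-in iris data. The `max_depth=3` value and the 5-fold cross-validation are arbitrary illustrative choices, and the sketch assumes MLJ plus the MLJDecisionTreeInterface glue package are installed.

```julia
# Minimal MLJ.jl workflow sketch: load data, bind a model, cross-validate.
using MLJ

# Built-in toy dataset shipped with MLJ (feature table X, label vector y).
X, y = @load_iris

# Pull the DecisionTree.jl classifier in through MLJ's model registry.
Tree = @load DecisionTreeClassifier pkg=DecisionTree verbosity=0
tree = Tree(max_depth=3)

# Bind model and data, then estimate accuracy with 5-fold cross-validation.
mach = machine(tree, X, y)
evaluate!(mach; resampling=CV(nfolds=5, shuffle=true), measure=accuracy)
```

The same `machine`/`evaluate!` pattern applies to other models registered with MLJ, including those exposed by the MLJFlux.jl and EvoTrees.jl packages above.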
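For Lux.jl, a small sketch of its explicit-parameter style follows; the layer sizes and the random input batch are made up purely for illustration.

```julia
# Minimal Lux.jl sketch: a Lux model is a pure function of (input, parameters, state).
using Lux, Random

rng = Random.default_rng()

# A small multilayer perceptron; the 2 => 16 => 1 sizes are arbitrary.
model = Chain(Dense(2 => 16, relu), Dense(16 => 1))

# Unlike in Flux, parameters and state live outside the model object.
ps, st = Lux.setup(rng, model)

x = rand(rng, Float32, 2, 8)   # batch of 8 two-dimensional inputs
y, st = model(x, ps, st)       # forward pass returns (output, updated state)
```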
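And for Clustering.jl, a short k-means sketch on synthetic data; the 5-feature matrix and the choice of 3 clusters are arbitrary.

```julia
# Minimal Clustering.jl sketch: k-means on a d x n data matrix (columns = observations).
using Clustering

X = rand(5, 200)        # 200 points with 5 features each
result = kmeans(X, 3)   # partition the points into 3 clusters

assignments(result)     # cluster index assigned to each point
counts(result)          # number of points in each cluster
```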