Parallel Computing Packages
- AnyMOD.jl (★55): Julia framework for energy system models with a focus on multi-period capacity expansion
- DistributedArrays.jl (★186): Distributed Arrays in Julia (usage sketch below)
- Dagger.jl (★502): A framework for out-of-core and parallel execution (usage sketch below)
- ClusterManagers.jl (★205): Launchers for Julia worker processes via cluster job-queue systems such as Slurm, SGE, and PBS (usage sketch below)
- MPI.jl (★320): MPI wrappers for Julia (usage sketch below)
- Hwloc.jl (★63): A Julia API for hwloc
- ScaLAPACK.jl (★5): Wrap ScaLAPACK in Julia
- FLoops.jl (★284): Fast sequential, threaded, and distributed for-loops for Julia (fold for humans™; usage sketch below)
- PencilArrays.jl (★48): Distributed Julia arrays using the MPI protocol
- DiffEqGPU.jl (★202): GPU-acceleration routines for DifferentialEquations.jl and the broader SciML scientific machine learning ecosystem
- ParallelStencil.jl (★238): Package for writing high-level code for parallel high-performance stencil computations that can be deployed on both GPUs and CPUs
- Gaius.jl (★113): Divide and Conquer Linear Algebra
- CheapThreads.jl (★169): The cheapest threads you can find!
- ParallelAccelerator.jl (★295): The ParallelAccelerator package, part of the High Performance Scripting project at Intel Labs
- AMDGPUnative.jl (★56): Julia interface to AMD/Radeon GPUs
- Blocks.jl (★30): A framework to represent chunks of entities and parallel methods on them
- Slurm.jl (★3): Experimental Julia interface to slurm.schedmd.com
- MessageUtils.jl (★6): Channels(), tspaces(), kvspaces() and more
- Elly.jl (★44): Hadoop HDFS and Yarn client
- FoldsCUDA.jl (★50): Data-parallelism on CUDA using Transducers.jl and for loops (FLoops.jl)
- HDFS.jl (★23): HDFS interface for Julia as a wrapper over the Hadoop HDFS library
- Dispatcher.jl (★46): Build, distribute, and execute task graphs
- Heptapus.jl (★8)
- PTools.jl (★8): Collection of utilities for parallel processing in Julia
- DispatcherCache.jl (★1): Adaptive persistency-based mechanism for Dispatcher task graphs
- ParallelGLM.jl (★11): Parallel fitting of GLMs using SharedArrays
- Flume.jl (★1): A port of the Google Flume Data-Parallel Pipeline system to Julia
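Usage sketches for a few of the entries above follow. First, DistributedArrays.jl: a minimal sketch of creating and reducing an array whose chunks live on Distributed.jl worker processes (assumes a recent DistributedArrays.jl release; the worker count is arbitrary).

```julia
using Distributed
addprocs(4)                          # start 4 local worker processes
@everywhere using DistributedArrays

A = drand(1_000, 1_000)              # random matrix, chunks spread across the workers
total = sum(A)                       # the reduction runs on the workers owning each chunk

# Inspect the chunk held by the first worker (the master process usually holds none)
part = @fetchfrom workers()[1] localpart(A)
size(part)
```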
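Dagger.jl schedules a graph of spawned tasks over the available threads and workers. A rough sketch of its eager Dagger.@spawn interface, assuming a reasonably recent Dagger version:

```julia
using Dagger

# Each @spawn returns a handle immediately; using a handle as an argument
# creates a dependency edge in the task graph.
a = Dagger.@spawn 1 + 2        # -> 3
b = Dagger.@spawn 2 * 3        # -> 6
c = Dagger.@spawn a + b        # runs once a and b have finished
fetch(c)                       # blocks until the graph completes; returns 9
```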
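ClusterManagers.jl plugs job schedulers such as Slurm into Distributed's addprocs. A sketch along the lines of its documented Slurm example; the worker count, partition, and time limit are placeholders for site-specific values.

```julia
using Distributed, ClusterManagers

# Request 8 Julia workers through the Slurm queue (partition and walltime are placeholders)
addprocs(SlurmManager(8), partition="debug", t="00:10:00")

@everywhere println("worker $(myid()) running on $(gethostname())")
```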
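MPI.jl wraps an installed MPI library; the same script is launched on every rank with mpiexec (or MPI.jl's mpiexecjl wrapper). A minimal reduction sketch, assuming the current MPI.jl API:

```julia
# Launch with, e.g.: mpiexec -n 4 julia script.jl
using MPI

MPI.Init()
comm   = MPI.COMM_WORLD
rank   = MPI.Comm_rank(comm)
nranks = MPI.Comm_size(comm)

local_value = Float64(rank)                  # one value per rank
total = MPI.Allreduce(local_value, +, comm)  # every rank receives the global sum

rank == 0 && println("sum over $nranks ranks = $total")
```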
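FLoops.jl turns an annotated for-loop into a fold that can run sequentially, on threads, or across processes depending on the executor passed to @floop. A small sketch of a threaded reduction, using the executor names exported by FLoops:

```julia
using FLoops

# Threaded sum of squares; swap ThreadedEx() for SequentialEx() or DistributedEx()
function sumsq(xs)
    @floop ThreadedEx() for x in xs
        @reduce(s = 0.0 + x^2)   # 0.0 is the accumulator's initial value
    end
    return s
end

sumsq(1:1_000_000)
```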