LRMoE.jl is an implementation of the Logit-weighted Reduced Mixture-of-Experts (LRMoE) model in Julia.
The package is introduced in Tseung et al. (2021).
To install the stable version of the package, simply type the following in the Julia REPL:

```julia
] add LRMoE
```
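
Equivalently, if you prefer calling the package manager from a script rather than the REPL's `pkg>` mode, the same installation can be done with the standard Pkg API:

```julia
using Pkg

# Install the registered (stable) version of LRMoE
Pkg.add("LRMoE")
```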
To install the latest development version, type the following in the Julia REPL:

```julia
] add https://github.com/sparktseung/LRMoE.jl
```
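
The development version can likewise be installed through the Pkg API, where `url` points at the GitHub repository above:

```julia
using Pkg

# Install the latest development version directly from GitHub
Pkg.add(url="https://github.com/sparktseung/LRMoE.jl")
```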
The full documentation is available here.