LRMoE.jl

LRMoE implemented in Julia
Author: UofTActuarial
Started in October 2020


LRMoE.jl is a Julia implementation of the Logit-weighted Reduced Mixture-of-Experts (LRMoE) model. The package is introduced in Tseung et al. (2021).

To install the stable version of the package, type the following in the Julia REPL:

] add LRMoE

To install the latest development version, type the following in the Julia REPL:

] add https://github.com/sparktseung/LRMoE.jl
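
Equivalently, either installation can be done programmatically with Julia's built-in Pkg API, for example from a script or CI job (a minimal sketch; the registered package name and repository URL are taken from the REPL commands above):

```julia
using Pkg

# Install the stable, registered release
Pkg.add("LRMoE")

# Or install the latest development version directly from GitHub
# (Pkg.add with the `url` keyword requires Julia 1.5 or later)
# Pkg.add(url="https://github.com/sparktseung/LRMoE.jl")
```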

The full documentation is available here.