MixedLRMoE.jl is a Julia implementation of the Mixed Logit-Reduced Mixture-of-Experts (Mixed LRMoE) model.
The theoretical development of the Mixed LRMoE is given in Fung and Tseung (2022+), while an application in the automobile insurance context is given in Tseung et al. (2023).
To install the stable version of the package, type the following in the Julia REPL:
] add MixedLRMoE
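Equivalently, the stable release can be installed programmatically through Julia's Pkg API; this is standard Julia tooling rather than anything specific to this package:

```julia
# Install the registered (stable) release via the Pkg API
using Pkg
Pkg.add("MixedLRMoE")
```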
To install the latest version from GitHub, type the following in the Julia REPL:
] add https://github.com/sparktseung/MixedLRMoE.jl
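A rough sketch of the same installation through the Pkg API, followed by loading the package to confirm it is available; only the package name and repository URL come from above, and the rest is generic Julia usage:

```julia
# Install the development version directly from GitHub via the Pkg API
using Pkg
Pkg.add(url="https://github.com/sparktseung/MixedLRMoE.jl")

# Load the package to verify that installation succeeded
using MixedLRMoE
```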
The full documentation is available here.