Hidden Markov Models for Julia.
HMMBase is no longer maintained. It will remain available as a Julia package, but we encourage existing and new users to migrate to HiddenMarkovModels.jl, which offers a similar interface. For more information, see HiddenMarkovModels.jl: when did HMMs get so fast?.
HMMBase provides a lightweight and efficient abstraction for hidden Markov models in Julia. Most HMM libraries only support discrete (e.g. categorical) or Normal observation distributions. In contrast, HMMBase builds upon Distributions.jl to support arbitrary univariate and multivariate distributions.
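Because emission distributions are plain Distributions.jl objects, mixing arbitrary univariate or multivariate families is straightforward. A minimal sketch, assuming the v1.x `HMM` constructor (the state count and all parameter values are illustrative):

```julia
using Distributions
using HMMBase

# Transition matrix for a hypothetical two-state chain (values are illustrative).
A = [0.9 0.1; 0.2 0.8]

# Any Distribution works as an emission: here state 1 emits from a Gamma
# and state 2 from a Normal.
hmm = HMM(A, [Gamma(2.0, 1.0), Normal(10.0, 1.0)])

# Multivariate emissions are supported the same way.
hmm_mv = HMM(A, [MvNormal(zeros(2), [1.0 0.0; 0.0 1.0]),
                 MvNormal(ones(2),  [1.0 0.0; 0.0 1.0])])
```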
See HMMBase.jl - A lightweight and efficient Hidden Markov Model abstraction for more details on the motivation behind this package.
Benchmark of HMMBase against hmmlearn and pyhsmm.
Features:
- Support for any observation distribution conforming to the Distributions.jl interface.
- Fast and stable implementations of the forward/backward, EM (Baum-Welch) and Viterbi algorithms.
Non-features:
- Multi-sequence HMMs, see MS_HMMBase
- Bayesian models, probabilistic programming, see Turing
- Nonparametric models (HDP-H(S)MM, ...)
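The algorithms listed above can be exercised end-to-end. A minimal sketch, assuming the v1.x API (model parameters and the sequence length are illustrative):

```julia
using Distributions
using HMMBase

# A hypothetical two-state Gaussian HMM (parameters are illustrative).
hmm = HMM([0.9 0.1; 0.1 0.9], [Normal(0.0, 1.0), Normal(10.0, 1.0)])

# Sample a hidden state sequence and matching observations.
z, y = rand(hmm, 500, seq = true)

# Forward recursion: filtered probabilities and the sequence log-likelihood.
alpha, logtot = forward(hmm, y)

# Smoothed state posteriors from the forward/backward recursions.
gamma = posteriors(hmm, y)

# Most likely state sequence (Viterbi).
zv = viterbi(hmm, y)

# Maximum-likelihood re-estimation with Baum-Welch (EM).
hmm_est, hist = fit_mle(hmm, y)
```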
The package can be installed with the Julia package manager. From the Julia REPL, type `]` to enter the Pkg REPL mode and run:
pkg> add HMMBase
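Equivalently, from a script or notebook, the same installation can be done through the Pkg API:

```julia
# Install HMMBase programmatically instead of via the Pkg REPL.
using Pkg
Pkg.add("HMMBase")
```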
- STABLE — documentation of the most recently tagged version.
- DEVEL — documentation of the in-development version.
The package is tested against Julia 1.0 and the latest Julia 1.x.
Starting with v1.0, we follow semantic versioning:
Given a version number MAJOR.MINOR.PATCH, increment the:
- MAJOR version when you make incompatible API changes,
- MINOR version when you add functionality in a backwards compatible manner, and
- PATCH version when you make backwards compatible bug fixes.
Contributions are very welcome, as are feature requests and suggestions. Please read the CONTRIBUTING.md file for information on how to contribute. Please open an issue if you encounter any problems.
Logo: lego by jon trillana from the Noun Project.