Implementation of Transitional Markov Chain Monte Carlo (TMCMC) in Julia.
Author: AnderGray (started January 2021)



This implementation is heavily inspired by the implementation of TMCMC in OpenCOSSAN.

The TMCMC algorithm can be used to sample from un-normalised probability density functions (e.g. posterior distributions in Bayesian updating). TMCMC overcomes some of the issues of Metropolis-Hastings:

  • Can efficiently sample multimodal distributions
  • Works well in high dimensions (within reason)
  • Computes the model evidence
  • Proposal distribution is selected by the algorithm
  • Easy to parallelise

Instead of attempting to sample directly from the posterior, TMCMC samples from a sequence of easy-to-sample "transitional" distributions, defined by:

$P^j \propto P(\mathbf{D}|\theta)^{\beta_j} \cdot P(\theta)$

where $0 \leq \beta_j \leq 1$ is increased by the algorithm from $\beta_j = 0$ (the prior) to $\beta_j = 1$ (the posterior).
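As a minimal sketch of this tempering idea (illustrative only, not the package's internal API), the $j$-th transitional log-density is the log-likelihood scaled by $\beta_j$ plus the log-prior:

```julia
using Distributions

# Illustrative helper (not part of TransitionalMCMC): tempered log-density
# for a given beta_j. beta = 0 recovers the prior; beta = 1 recovers the
# (un-normalised) posterior.
transitional_logpdf(loglik, logprior, beta) = x -> beta * loglik(x) + logprior(x)

loglik(x)   = logpdf(Normal(0, 1), x)    # toy log-likelihood
logprior(x) = logpdf(Uniform(-5, 5), x)  # toy log-prior

f0 = transitional_logpdf(loglik, logprior, 0.0)  # equals the log-prior
f1 = transitional_logpdf(loglik, logprior, 1.0)  # log-likelihood + log-prior
```

Because each step changes $\beta_j$ only slightly, samples from one transitional distribution are good starting points for the next.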


This is a registered Julia package:

julia> ]
pkg> add TransitionalMCMC


Sampling Himmelblau's Function:

using StatsBase, Distributions, PyPlot
using TransitionalMCMC

# Prior Bounds
lb = -5
ub = 5

# Prior log Density and sampler
logprior(x) = logpdf(Uniform(lb,ub), x[1]) + logpdf(Uniform(lb,ub), x[2])
priorRnd(Nsamples) = rand(Uniform(lb,ub), Nsamples, 2)

# Log Likelihood
logLik(x) = -1 * ((x[1]^2 + x[2] - 11)^2 + (x[1] + x[2]^2 - 7)^2)

samps, Log_ev = tmcmc(logLik, logprior, priorRnd, 2000)

plt.scatter(samps[:,1], samps[:,2])

For parallel execution:

using Distributed, StatsBase, Distributions, PyPlot

addprocs(4; exeflags="--project")
@everywhere begin
    using TransitionalMCMC

    # Prior Bounds
    lb, ub  = -5, 5

    # Prior log Density and sampler
    logprior(x) = logpdf(Uniform(lb, ub), x[1]) + logpdf(Uniform(lb, ub), x[2])
    priorRnd(Nsamples) = rand(Uniform(lb, ub), Nsamples, 2)

    # Log Likelihood
    function logLik(x)
        return -1 * ((x[1]^2 + x[2] - 11)^2 + (x[1] + x[2]^2 - 7)^2)
    end
end

Nsamples = 2000

samps, Log_ev = tmcmc(logLik, logprior, priorRnd, Nsamples, 5, 2)


Benchmark scripts can be found in /slurm_benchmarks.

Testing scalability of tmcmcHimmelblau.jl with different model evaluation times.

Testing slowdown and iteration count for various dimensions. The target is a mixture of 2 Gaussians in N dimensions, with means located at [-5, -5, ...] and [5, 5, ...].
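The benchmark target described above can be sketched as follows (an illustrative log-likelihood, not the benchmark script itself; equal weights and unit variance per dimension are assumed here):

```julia
using Distributions

# Equal-weight mixture of two N-dimensional Gaussians with means at
# [-5, -5, ...] and [5, 5, ...]; unit variance per dimension is assumed.
function mixture_loglik(x)
    l1 = sum(logpdf.(Normal(-5, 1), x))  # component centred at [-5, -5, ...]
    l2 = sum(logpdf.(Normal(5, 1), x))   # component centred at [5, 5, ...]
    m  = max(l1, l2)                     # log-sum-exp for numerical stability
    return m + log(0.5 * exp(l1 - m) + 0.5 * exp(l2 - m))
end
```

A well-separated bimodal target like this is exactly the case where single-chain Metropolis-Hastings tends to get stuck in one mode, while TMCMC's tempered transitions can reach both.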


Todo:

  • Plotting functions
  • Storing samples across iterations
  • FE example

