Further algorithm detail

This page describes the steps used in our algorithm in a bit more detail.

Given an unnormalised likelihood, f(y;θ), with intractable normalising constant, Z(θ) = ∫ f(y;θ) dy, we can rewrite the likelihood as a series expansion using some simple manipulation. Introducing a constant Ẑ(θ) and writing κ(θ) = 1 − Z(θ)/Ẑ(θ), the geometric series gives

$$
p(y \mid \theta) = \frac{f(y;\theta)}{Z(\theta)} = \frac{f(y;\theta)}{\hat{Z}(\theta)} \cdot \frac{1}{1 - \kappa(\theta)} = \frac{f(y;\theta)}{\hat{Z}(\theta)} \sum_{k=0}^{\infty} \kappa(\theta)^{k}.
$$
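As a quick numerical sanity check of this expansion, here is a toy computation with made-up values of f, Z and Ẑ (none of them from the text), confirming that the truncated geometric series recovers f/Z:

```python
# Toy check of the geometric-series identity: with |kappa| = |1 - Z/Zhat| < 1,
#   f/Z = (f/Zhat) * sum_k kappa^k.
f = 2.5      # unnormalised likelihood value f(y; theta) -- illustrative
Z = 4.0      # true normalising constant (known here only for the check)
Zhat = 5.0   # our introduced constant, giving kappa = 1 - Z/Zhat = 0.2

kappa = 1.0 - Z / Zhat
partial = (f / Zhat) * sum(kappa**k for k in range(50))
print(partial, f / Z)  # the partial sum converges to the exact likelihood
```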



This infinite series converges if |κ(θ)| < 1. Each term in the sum can be estimated unbiasedly: the k-th term, κ(θ)^k, is the product of k independent unbiased estimates of κ(θ), which we can obtain via importance sampling (an unbiased importance-sampling estimate of Z(θ) gives an unbiased estimate of κ(θ), since κ(θ) is affine in Z(θ)).
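To make the importance-sampling step concrete, here is a minimal sketch assuming a toy unnormalised density f(y) = exp(−y²/2) (so Z = √(2π)), a Gaussian proposal q, and an arbitrary constant Ẑ = 3 — all illustrative choices, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unnormalised density: f(y) = exp(-y^2/2), so Z = sqrt(2*pi).
def f(y):
    return np.exp(-0.5 * y**2)

Zhat = 3.0  # assumed constant chosen so that |kappa| < 1

def kappa_hat(n):
    """One unbiased estimate of kappa = 1 - Z/Zhat via importance sampling.

    Proposal q = N(0, 2); the IS average of f(y)/q(y) is unbiased for Z,
    and kappa is affine in Z, so plugging the estimate in stays unbiased.
    """
    y = rng.normal(0.0, np.sqrt(2.0), size=n)
    q = np.exp(-0.25 * y**2) / np.sqrt(4.0 * np.pi)  # N(0, 2) density
    Z_is = np.mean(f(y) / q)
    return 1.0 - Z_is / Zhat

est = np.mean([kappa_hat(100) for _ in range(200)])
print(est, 1.0 - np.sqrt(2.0 * np.pi) / Zhat)  # estimate vs exact kappa
```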

The series can be truncated unbiasedly in a number of ways. The simplest is to draw an integer K from a distribution p(·) supported on {0, 1, 2, …}, with ∑_{K=0}^∞ p(K) = 1, and divide an unbiased estimate of the K-th term by p(K); an estimate of the sum is then given by

$$
\hat{p}(y \mid \theta) = \frac{f(y;\theta)}{\hat{Z}(\theta)} \cdot \frac{1}{p(K)} \prod_{i=1}^{K} \hat{\kappa}_i(\theta),
$$

where the κ̂_i(θ) are independent unbiased estimates of κ(θ). Taking expectations over K and the κ̂_i(θ) recovers the full series.

Note that the variance of this estimate may be infinite depending on the choice of p(k).
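The single-draw truncation above can be sketched as follows, assuming a geometric distribution for p(K), a known κ = 0.2, and a stand-in noisy unbiased estimator of κ (all illustrative choices, not from the text):

```python
import numpy as np

rng = np.random.default_rng(1)

kappa = 0.2  # assumed true value of kappa(theta), with |kappa| < 1

def kappa_hat():
    """Stand-in unbiased estimator of kappa (noisy, mean = kappa)."""
    return kappa + rng.normal(0.0, 0.1)

def single_term_estimate(r=0.5):
    """Unbiased estimate of sum_k kappa^k = 1/(1 - kappa).

    Draw K from the geometric pmf p(K) = (1-r) r^K on {0, 1, ...}, then
    return an unbiased estimate of the K-th term divided by p(K); the
    K-th term kappa^K needs K independent estimates of kappa.
    """
    K = rng.geometric(1.0 - r) - 1  # numpy's geometric starts at 1
    p_K = (1.0 - r) * r**K
    term = float(np.prod([kappa_hat() for _ in range(K)]))  # empty product = 1
    return term / p_K

est = np.mean([single_term_estimate() for _ in range(20000)])
print(est, 1.0 / (1.0 - kappa))  # should be close to 1.25
```

With a geometric p(K) the estimator's second moment is finite here, but heavier truncation distributions or larger |κ| can push the variance to infinity, which is the caveat noted above.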

Finally, we make use of a result from the QCD literature: likelihood estimates produced by the above technique, even negative ones, can be used in exact-approximate MCMC to obtain Monte Carlo estimates of expectations with respect to the exact target. The absolute value of the estimate is used in the Metropolis-Hastings acceptance ratio, the sign σ(θ) of the estimate is stored alongside each sample, and the samples are corrected once the MCMC chain has run by weighting with the signs:

$$
\mathbb{E}_{\pi}[h(\theta)] \approx \frac{\sum_{i} \sigma(\theta_i)\, h(\theta_i)}{\sum_{i} \sigma(\theta_i)}.
$$
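A minimal pseudo-marginal sketch of this sign trick, using a toy Gaussian target and a stand-in noisy (possibly negative) unbiased likelihood estimator in place of the series estimate itself — all assumptions for illustration, not the method's actual estimator:

```python
import numpy as np

rng = np.random.default_rng(2)

def lik_hat(theta):
    """Stand-in for a possibly negative unbiased likelihood estimate;
    here a noisy version of the N(0, 1) density at theta."""
    return np.exp(-0.5 * theta**2) * (1.0 + rng.normal(0.0, 0.7))

def sign_corrected_mh(n_iter=50000):
    theta, L = 0.0, lik_hat(0.0)
    thetas, signs = [], []
    for _ in range(n_iter):
        prop = theta + rng.normal(0.0, 1.0)
        L_prop = lik_hat(prop)
        # MH accept/reject with the ABSOLUTE value of the estimate
        # (flat prior on theta for simplicity)
        if rng.uniform() < abs(L_prop) / abs(L):
            theta, L = prop, L_prop
        thetas.append(theta)
        signs.append(np.sign(L))
    thetas, signs = np.array(thetas), np.array(signs)
    # Correct afterwards: E[h] ~ sum(sign * h) / sum(sign)
    return np.sum(signs * thetas**2) / np.sum(signs)

ev = sign_corrected_mh()
print(ev)  # sign-corrected estimate of E[theta^2] under N(0, 1)
```

Because the multiplicative noise here only rarely flips the sign, the sign-corrected average of θ² should land near the true value of 1; as the proportion of negative estimates grows, the denominator shrinks and the corrected estimator's variance blows up.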