SMC sampler

Sequential Monte Carlo (SMC) methods are usually designed to deal with probability distributions of increasing dimension. The canonical example is the sequence of distributions {\pi_n(X_{1:n}) = \mathbb{P}(X_{1:n} \, | Y_{1:n})} where {\{ X_k \}_{k \geq 0}} is a hidden Markov chain and {\{ Y_k \}_{k \geq 0}} is an observation process. The notation {X_{1:n}} is a shortcut for {(X_1, X_2, \ldots, X_n)}. This is usually called a Hidden Markov Model in the literature. If the Markov chain {X_k} evolves in {\mathbb{R}^d} then the distribution {\pi_n} is a probability distribution on {\big( \mathbb{R}^d \big)^n}.
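
To see why the dimension grows, write {\mu} for the initial distribution of the chain, {f} for its transition density and {g} for the observation density (this notation is mine, the post does not fix one); then

\displaystyle \begin{array}{rcl} \pi_n(x_{1:n}) \; \propto \; \mu(x_1) \, g(y_1 \, | \, x_1) \, \prod_{k=2}^{n} f(x_k \, | \, x_{k-1}) \, g(y_k \, | \, x_k), \end{array}

so each new observation {Y_n} appends one factor, and one coordinate, to the target.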

It turns out that one can use the same kind of ideas to study a single distribution {\pi}. If {\pi} is difficult to sample from directly and traditional MCMC methods struggle (because of dimensionality, multi-modality, poor mixing properties, etc.), one can try to use SMC methods to study the distribution {\pi}. To this end, one can introduce a sequence of bridging distributions

\displaystyle \begin{array}{rcl} \pi_0, \pi_1, \ldots, \pi_n \end{array}

where {\pi_0} is typically very easy to sample from and {\pi_n=\pi}. A traditional choice is the geometric bridge {\pi_k(x) \propto \pi(x)^{\alpha_k}} where {0 = \alpha_0 < \alpha_1 < \ldots < \alpha_n = 1}. The bigger the index {k}, the closer {\pi_k} is to the target and the more intricate it becomes. The problem is that each distribution {\pi_k} has the same dimensionality as the original distribution {\pi}, so the usual SMC machinery, which is built for state spaces of increasing dimension, does not apply directly. For the sake of concreteness, let us suppose that {\pi} is a distribution on {\mathbb{R}^d}. The fundamental remark at the basis of what is now known as Sequential Monte Carlo samplers is that {\pi_k(x_k)} can be seen as the marginal distribution of the probability

\displaystyle \begin{array}{rcl} \Pi_k(x_1, \ldots, x_k) = \pi_k(x_k) L_k(x_k, x_{k-1})L_{k-1}(x_{k-1}, x_{k-2}) \ldots L_2(x_2,x_1) \end{array}

where {L_j(\cdot, \cdot)} are Markov kernels in the sense that {\int L_j(x, y) \, dy = 1} for any {x}: since each kernel integrates to one, integrating {\Pi_k} over {x_1}, then {x_2}, up to {x_{k-1}} leaves exactly {\pi_k(x_k)}. The distribution {\Pi_k} lives on {\big( \mathbb{R}^d \big)^k} and one can use the traditional SMC methods to explore the sequence of distributions {\Pi_1, \ldots, \Pi_n}. This algorithm was devised a few years ago by Pierre Del Moral, Arnaud Doucet and Ajay Jasra. It is extremely powerful, and several other algorithms can be seen as particular cases of it.

Because I had never implemented the algorithm before, I tried yesterday to see how this works on a very simple example. My target distribution was an equal mixture of seven Gaussian distributions in {\mathbb{R}^2} with unit covariance matrices, centred at the vertices of a regular heptagon with radius {20}. I started with {N=3000} particles at the centre of the heptagon and let them evolve according to a random walk. When the effective sample size {\text{ESS} = 1/\sum_i w_i^2} (computed from the normalized weights {w_i}) dropped below {N/2}, the whole particle system was regenerated by resampling. In order to obtain a sweet movie, I used {100} bridge distributions {\pi_k \propto \pi^{k/100}}.
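
In code, the regeneration rule just described might look like the following minimal sketch, where particles is assumed to be an (N, d) numpy array and rng an np.random.default_rng() instance (for brevity this sketch uses numpy's multinomial resampling; other schemes such as systematic resampling work just as well):

    import numpy as np

    def effective_sample_size(weights):
        # ESS = 1 / sum of squared weights, for weights normalized to sum to one
        return 1.0 / np.sum(weights ** 2)

    def maybe_resample(particles, weights, rng):
        # regenerate the whole particle system when the ESS drops below N/2
        N = len(particles)
        if effective_sample_size(weights) < N / 2:
            idx = rng.choice(N, size=N, p=weights)  # multinomial resampling
            particles = particles[idx]
            weights = np.full(N, 1.0 / N)
        return particles, weights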

The result is clear and quite impressive. It could even have been made much more efficient by using far fewer bridging distributions. Some recent theory has been developed to analyse the stability of the algorithm in high-dimensional situations. The analysis described in the article uses a scaling limit approach very similar to some ideas used to analyse MCMC algorithms, and to some more recent developments along the same lines that we have worked on with Andrew and Natesh. Very exciting stuff!

[EDIT: 2 October 2012]
I received several emails asking me to describe how to obtain the short animation showing the evolution of the particle system. The code is available here. It is a quick Python script producing a bunch of .png files. It then suffices to stitch them together into an animated .gif file. This can, for example, easily be done using the free (and very good) Photoshop-like software GIMP: open all the .png files as different layers, then export the result as a .gif file with the option ‘animated gif’ checked. Voila!
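
For readers who prefer to stay in Python, the frames can also be stitched together programmatically; this sketch assumes the third-party imageio package is installed (the exact meaning of the duration argument depends on the imageio version):

    import glob
    import imageio.v2 as imageio

    # read the frames produced by the script, in filename order
    frames = [imageio.imread(f) for f in sorted(glob.glob("smc_sampler_*.png"))]
    imageio.mimsave("smc_sampler.gif", frames, duration=0.1)  # seconds per frame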

5 Comments

  1. October 1, 2012 at 6:48 pm

    Very cool. Do you mind sharing how you implemented the algorithm and made that movie?

  2. David C. said,

    November 22, 2013 at 9:47 pm

    Really great animation there. I am just learning about SMC methods, and it seems the code you posted on CodeViewer has expired. Would you mind sharing it again?

  3. May 12, 2014 at 12:33 pm

    […] the bottleneck. In the case of the particle filter (or any sequential Monte Carlo method such as SMC samplers), that bottleneck is the resampling step. The article investigates this issue and numerically […]

  4. Tim Reynolds said,

    January 5, 2017 at 10:41 pm

    Any chance you can repost that code? I really like the animation.

  5. alek said,

    January 6, 2017 at 3:09 am

    Hi Tim,
I just dug up this very old piece of Python code. It is pretty ugly, but here it is:


    # LIBRARY IMPORT
    import pylab
    from math import exp, log, cos, sin, pi
    from random import random, gauss
    #
    # BRIDGING DENSITY starting from the flat density
    # (kept for reference; the sampler below works directly with tempered log-densities)
    #
    def density_bridge(x, y, center_coord, std, k, N):
        if k == 0:
            return 1.0
        return pow(density(x, y, center_coord, std), k / N)
    #
    # mixture of Gaussians: center_coord = centres of the Gaussians
    #
    def density(x, y, center_coord, std):
        likelihood = 0.0
        for c_x, c_y in center_coord:
            likelihood += exp(-(pow(x - c_x, 2) + pow(y - c_y, 2)) / (2 * std * std))
        return likelihood
    #
    # 2-dimensional random walk proposals
    #
    def proposal(x, y, delta):
        return (gauss(x, delta), gauss(y, delta))
    #
    # compute the Effective Sample Size (ESS) from normalized weights
    #
    def compute_ESS(weights):
        return 1.0 / sum(w * w for w in weights)
    #
    # systematic resampling
    #
    def resampling(particles, weights):
        # check that the weights are normalized
        if not (0.999 < sum(weights) < 1.001):
            print("ERROR: !! weights are not normalized !!")
        N = len(particles)
        particles_new = [None] * N
        w_sum = weights[0]
        current_index = 0
        current = random() / N
        for k in range(N):
            while current > w_sum:
                current_index += 1
                w_sum += weights[current_index]
            particles_new[k] = particles[current_index]
            current += 1.0 / N
        return particles_new
    #
    # SMC sampler
    #
    def smc(particles, delta, N_temperature, center_coord, std, save_file=False):
        weights = [1.0 for p in particles]
        N_particles = len(particles)
        for t in range(1, N_temperature + 1):
            print("iteration", t, "out of", N_temperature)
            for k in range(N_particles):
                x, y = particles[k]
                # incremental weight pi_t / pi_{t-1} = pi^(1/N_temperature),
                # evaluated at the current (pre-move) position
                L_W = log(density(x, y, center_coord, std)) / N_temperature
                weights[k] = weights[k] * exp(L_W)
                # evolve each particle with pi_t as target: Metropolis-Hastings
                # with log acceptance (for more stability); the log-ratio is
                # tempered by t/N_temperature so the move leaves pi_t invariant
                x_new, y_new = proposal(x, y, delta)
                L_accept = (t / N_temperature) * (
                    log(density(x_new, y_new, center_coord, std))
                    - log(density(x, y, center_coord, std)))
                if log(random()) < L_accept:
                    particles[k] = (x_new, y_new)
            # renormalise weights
            w_sum = sum(weights)
            weights = [w / w_sum for w in weights]
            ESS = compute_ESS(weights)
            print(" ESS =", ESS)
            # resample if ESS < N/2
            if ESS < N_particles / 2:
                particles = resampling(particles, weights)
                weights = [1.0 for w in weights]
            if save_file:
                name = "smc_sampler_" + str(1000 + t)
                pylab.clf()
                plot_radius = 30
                frame1 = pylab.gca()
                frame1.axes.get_xaxis().set_visible(False)
                frame1.axes.get_yaxis().set_visible(False)
                pylab.axis([-plot_radius, plot_radius, -plot_radius, plot_radius])
                particles_x = [x for (x, y) in particles]
                particles_y = [y for (x, y) in particles]
                pylab.plot(particles_x, particles_y, "ro", alpha=0.3)
                center_x = [x for (x, y) in center_coord]
                center_y = [y for (x, y) in center_coord]
                pylab.plot(center_x, center_y, "bo")
                pylab.savefig(name)
        return resampling(particles, weights)
    #
    # CREATE TARGET + BRIDGE DENSITIES
    #
    center_coord = []
    std = 1.0
    radius = 20
    for k in range(7):
        theta = 2 * pi * k / 7
        center_coord.append((radius * cos(theta), radius * sin(theta)))
    #
    # CREATE PARTICLE SYSTEM
    #
    N_particles = 3000
    N_temperature = 100
    particles = [(0.0, 0.0) for k in range(N_particles)]
    #
    # EVOLVE PARTICLES
    #
    delta = 0.6
    save_file = True
    particles = smc(particles, delta, N_temperature, center_coord, std, save_file)
    particles_x = [x for (x, y) in particles]
    particles_y = [y for (x, y) in particles]
    #
    # PLOT THE RESULTS
    #
    pylab.figure()
    pylab.plot(particles_x, particles_y, "ro", alpha=0.7)
    pylab.show()
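
    Running the script drops the frames smc_sampler_1001.png through smc_sampler_1100.png in the working directory; the GIMP recipe in the edit above (or any similar tool) will assemble them into the animation.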

