A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.



Resource Management with Slurm

3 minute read


slurm (Simple Linux Utility for Resource Management) is a resource manager for running compute jobs across multiple servers. While this brings additional control, it also imposes constraints on compute resources and forces all interaction with the servers through the slurm interface, which can be a pain. This post aims to be a useful go-to guide to common slurm commands, with examples of how slurm can be used without the pain.
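As a taste of the kind of usage the post covers, here is a minimal sketch of a slurm batch script. The partition name, resource requests, and `train.py` are illustrative placeholders; actual values depend on your cluster.

```shell
#!/bin/bash
#SBATCH --job-name=example       # name shown in squeue
#SBATCH --partition=compute      # assumption: partition names vary by cluster
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=4
#SBATCH --mem=8G
#SBATCH --time=01:00:00          # wall-clock limit, hh:mm:ss
#SBATCH --output=%x-%j.out       # %x = job name, %j = job id

python train.py
```

The script is submitted with `sbatch job.sh`, monitored with `squeue -u $USER`, and cancelled with `scancel <jobid>`.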

Conda Environments

1 minute read


Virtual environments are a convenient way to manage library dependencies and environment variables, and to ensure reproducibility. There are a few approaches to this: virtualenv, conda, and docker – see here for a discussion. This post will focus on conda, and give a few practical commands to get up and running.
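The practical workflow boils down to a handful of commands. A minimal sketch, assuming conda is installed; the environment name `myenv` and the packages are illustrative:

```shell
# create and activate an environment with a pinned Python version
conda create --name myenv python=3.10
conda activate myenv

# install packages into the active environment
conda install numpy pandas

# export the environment for reproducibility, and recreate it elsewhere
conda env export > environment.yml
conda env create --file environment.yml
```

Exporting to `environment.yml` is what makes the setup reproducible: a collaborator can recreate the exact environment from that one file.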


MCMC for Bayesian Non-Parametric Mixture Models


This dissertation concerns the use of Markov Chain Monte Carlo (MCMC) procedures in performing Bayesian inference on non-parametric mixture models. In particular, this report will focus on Dirichlet Process Mixture Models.

Download here

Masked Bouncy Particle Sampler


Piecewise deterministic Markov processes (PDMPs) provide the foundation for a promising class of non-reversible, continuous-time Markov Chain Monte Carlo (MCMC) procedures and have been shown experimentally to enjoy attractive scaling properties in high-dimensional settings. This work introduces the Masked Bouncy Particle Sampler (BPS), a flexible MCMC procedure within the PDMP framework that exploits model structure and modern parallel computing resources using chromatic spatial partitioning ideas from the discrete-time MCMC literature (Gonzalez et al., 2011). We extend the basic procedure by introducing a dynamic factorisation scheme for the target distribution to reduce the boundary effects commonly associated with fixed partitioning. The validity of the proposed method is theoretically justified, and we provide experimental evidence that the Masked Bouncy Particle Sampler delivers significant efficiency gains over other state-of-the-art sampling schemes for certain high-dimensional sparse models.

Download here

Differentiable Particle Filtering


Particle Filtering (PF) methods are an established class of procedures for performing inference in non-linear state-space models. Resampling is a key ingredient of PF, necessary to obtain low-variance likelihood and state estimates. However, traditional resampling methods result in PF-based loss functions being non-differentiable with respect to model and PF parameters. In a variational inference context, resampling also yields high-variance gradient estimates of the PF-based evidence lower bound. By leveraging optimal transport ideas, we introduce a principled differentiable particle filter and provide convergence results. We demonstrate this novel method on a variety of applications.

Download here





Undergraduate course, Balliol College, University of Oxford, 2019