Likelihood-free inference with emulator networks


    Approximate Bayesian Computation (ABC) provides methods for Bayesian inference in simulation-based stochastic models that do not permit tractable likelihoods. We present a new ABC method that uses probabilistic neural emulator networks to learn synthetic likelihoods on simulated data: both local emulators, which approximate the likelihood for specific observed data, and global ones, which are applicable to a range of data. Simulations are chosen adaptively using an acquisition function that takes into account uncertainty about either the posterior distribution of interest or the parameters of the emulator. Our approach does not rely on user-defined rejection thresholds or distance functions. We illustrate inference with emulator networks on synthetic examples and on a biophysical neuron model, and show that emulators allow accurate and efficient inference even on high-dimensional problems that are challenging for conventional ABC approaches.
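    To give a flavor of the synthetic-likelihood idea underlying the method, the sketch below is a deliberately minimal, hypothetical illustration: a one-dimensional toy simulator and a linear-Gaussian emulator standing in for the probabilistic neural network of the paper. The simulator, training sizes, prior, and observed value are all invented for illustration, and the adaptive acquisition step is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulator (hypothetical stand-in for an expensive model):
# given a parameter theta, emit one noisy scalar summary statistic.
def simulator(theta):
    return theta + rng.normal(0.0, 1.0)

# 1. Draw training parameters from a flat prior and simulate once each.
thetas = rng.uniform(-5.0, 5.0, size=200)
xs = np.array([simulator(t) for t in thetas])

# 2. Fit a linear-Gaussian "emulator": the mean of x as a linear
#    function of theta, plus a constant residual variance. In the
#    paper a neural network with a probabilistic output plays this role.
A = np.vstack([thetas, np.ones_like(thetas)]).T
coef, *_ = np.linalg.lstsq(A, xs, rcond=None)
resid_var = np.mean((xs - A @ coef) ** 2)

def synthetic_loglik(theta, x_obs):
    """Emulated log-likelihood of x_obs under parameter theta."""
    mu = coef[0] * theta + coef[1]
    return -0.5 * ((x_obs - mu) ** 2 / resid_var
                   + np.log(2.0 * np.pi * resid_var))

# 3. Evaluate the emulated posterior on a grid (flat prior), with no
#    rejection threshold or distance function involved.
x_obs = 1.5
grid = np.linspace(-5.0, 5.0, 401)
logpost = synthetic_loglik(grid, x_obs)
post = np.exp(logpost - logpost.max())
post /= post.sum() * (grid[1] - grid[0])   # normalize on the grid
theta_map = grid[np.argmax(post)]
```

Because the toy simulator's summary is centered on theta, the emulated posterior peaks near the observed value; in the paper, new simulations would instead be requested adaptively where the emulator is most uncertain.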


    Jan-Matthis Lueckmann, Giacomo Bassetto, Theofanis Karaletsos, Jakob H. Macke



    Full Paper

    ‘Likelihood-free inference with emulator networks’ (PDF)

    Uber AI

    Theofanis Karaletsos
    Theofanis took his first steps as a machine learner at the Max Planck Institute for Intelligent Systems, in collaboration with Microsoft Research Cambridge, with work focused on unsupervised knowledge extraction from unstructured data, such as generative modeling of images and phenotyping for biology. He then moved to Memorial Sloan Kettering Cancer Center in New York, where he worked on machine learning in the context of cancer therapeutics. He joined the small AI startup Geometric Intelligence in 2016 and, with his colleagues, formed the new Uber AI Labs. Theofanis' research interests are focused on rich probabilistic modeling, approximate inference, and probabilistic programming. His main passion is structured models, examples of which are spatio-temporal processes, models of image formation, deep probabilistic models, and the tools needed to make them work on real data. His past in the life sciences has also made him keenly interested in making models interpretable and quantifying their uncertainty, in non-traditional learning settings such as weakly supervised learning, and in model criticism.