• PhD, 2014 - Present
    Machine Learning

    University of Oxford, Department of Statistics

  • M.Sc., 2012 - 2014
    Applied Mathematics

    ETH Zurich, Department of Mathematics

  • B.Sc., 2009 - 2014
    Mathematics

    ETH Zurich, Department of Mathematics

DR-ABC: Approximate Bayesian Computation with Kernel-Based Distribution Regression

Jovana Mitrovic, Dino Sejdinovic, Yee Whye Teh
ICML 2016

Abstract

Performing exact posterior inference in complex generative models is often difficult or impossible due to an expensive-to-evaluate or intractable likelihood function. Approximate Bayesian computation (ABC) is an inference framework that constructs an approximation to the true likelihood based on the similarity between the observed and simulated data, as measured by a predefined set of summary statistics. Although the choice of appropriate problem-specific summary statistics crucially influences the quality of the likelihood approximation, and hence the quality of the posterior sample in ABC, there are only a few principled general-purpose approaches to the selection or construction of such summary statistics. In this paper, we develop a novel framework for this task using kernel-based distribution regression: we model the functional relationship between data distributions and the optimal choice (with respect to a loss function) of summary statistics. We show that our approach can be implemented in a computationally and statistically efficient way using the random Fourier features framework for large-scale kernel learning. Moreover, our framework shows superior performance compared to related methods on toy and real-world problems.
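The abstract relies on random Fourier features for scalable kernel learning. As a general illustration of that building block (the Rahimi-Recht construction for an RBF kernel, not the paper's DR-ABC implementation; all function names here are hypothetical), a minimal sketch:

```python
import numpy as np

def random_fourier_features(X, n_features=100, gamma=1.0, seed=0):
    """Approximate the feature map of the RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2) with random Fourier features.
    Frequencies are drawn from the kernel's spectral density N(0, 2*gamma*I)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    # Inner products of these features concentrate around the true kernel values.
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# With enough features, Z @ Z.T approximates the exact Gram matrix.
X = np.random.default_rng(1).normal(size=(5, 3))
Z = random_fourier_features(X, n_features=5000)
K_approx = Z @ Z.T
K_exact = np.exp(-((X[:, None] - X[None]) ** 2).sum(-1))
```

The approximation error decays as O(1/sqrt(n_features)), which is what makes kernel-based distribution regression tractable at scale.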

Deep Kernel Machines via the Kernel Reparametrization Trick

Jovana Mitrovic, Dino Sejdinovic, Yee Whye Teh
Workshop Track ICLR 2017

Abstract

While deep neural networks have achieved state-of-the-art performance on many tasks across varied domains, they remain black boxes whose inner workings are hard to interpret and understand. In this paper, we develop a novel method for efficiently capturing the behaviour of deep neural networks using kernels. In particular, we construct a hierarchy of increasingly complex kernels that encode individual hidden layers of the network. Furthermore, we discuss how our framework motivates a novel supervised weight initialization method that discovers highly discriminative features already at initialization.
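To make the idea of a per-layer kernel hierarchy concrete, here is a toy sketch (an assumption for illustration, not the paper's kernel reparametrization trick): each hidden layer's activation map induces a Gram matrix, and deeper layers yield increasingly complex kernels.

```python
import numpy as np

def layer_kernel(phi, X, Y=None):
    """Kernel induced by a feature map phi (e.g. a hidden-layer activation):
    k(x, y) = <phi(x), phi(y)>."""
    Fx = phi(X)
    Fy = Fx if Y is None else phi(Y)
    return Fx @ Fy.T

# Toy two-layer ReLU network; each layer defines one kernel in the hierarchy.
rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(3, 8)), rng.normal(size=(8, 4))
relu = lambda A: np.maximum(A, 0)
h1 = lambda X: relu(X @ W1)       # first hidden layer
h2 = lambda X: relu(h1(X) @ W2)   # second hidden layer, composed on the first

X = rng.normal(size=(6, 3))
K1 = layer_kernel(h1, X)  # kernel encoding layer 1
K2 = layer_kernel(h2, X)  # kernel encoding the deeper representation
```

Since each Gram matrix comes from an explicit feature map, K1 and K2 are symmetric positive semi-definite by construction.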

  • Research Intern, to start in June 2017

    DeepMind, London, UK

  • Research Intern, 08/2016 - 11/2016

    Supervisors: Yoshua Bengio and Aaron Courville

    MILA, University of Montreal, Canada

  • 2014
    Clarendon Fund Scholarship
    • Awarded to the top 3% of accepted graduate students across all disciplines at the University of Oxford
    • Full scholarship for PhD studies
  • 2008
    Scholarship for academic excellence
    • Awarded to the best high school students in the City of Belgrade, Serbia
  • 2008
    Award for academic excellence
    • Awarded to the best high school students in the Republic of Serbia
    • Awarded by the Dositeja - Fund for Young Talents, Ministry of Youth and Sports