Adam Foster

Probabilistic machine learning, deep learning, unsupervised representation learning, optimal experimental design, probabilistic programming

I am a DPhil student in Statistics at the University of Oxford, supervised by Prof. Yee Whye Teh. I received my Bachelor's and Master's degrees in mathematics from Cambridge and worked as a machine learning engineer before joining the department. I have a broad range of interests in statistical machine learning.

A large part of my work in Oxford has been on optimal experimental design: how do we design experiments that will be most informative about the process under investigation, whilst minimizing cost? I contribute to the deep probabilistic programming language Pyro and am the main author of Pyro's experimental design support. There is a deep mathematical connection between optimal experimental design and mutual information maximization, a key tool in unsupervised representation learning. I am currently interested in how we might use mutual information to learn good representations, and in how to evaluate the quality of representations once we have them.
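To make the connection concrete: in a Bayesian model, the expected information gain (EIG) of a design is exactly the mutual information between the unknown parameters and the experiment's outcome under that design. The sketch below is a generic nested Monte Carlo EIG estimator on a toy linear-Gaussian model of my own choosing — it illustrates the idea only, and is not the estimator or API from my papers or from Pyro.

```python
import numpy as np

def nmc_eig(design, sigma=1.0, n_outer=2000, n_inner=2000, seed=0):
    """Nested Monte Carlo estimate of the expected information gain (EIG)
    for a toy linear-Gaussian model (illustrative assumption):
        theta ~ N(0, 1),    y | theta, design ~ N(design * theta, sigma^2).
    EIG(design) = E[log p(y | theta, design) - log p(y | design)],
    i.e. the mutual information between theta and y at this design."""
    rng = np.random.default_rng(seed)
    const = -np.log(sigma * np.sqrt(2.0 * np.pi))

    # Outer samples: draw (theta, y) jointly from the model.
    theta = rng.standard_normal(n_outer)
    y = design * theta + sigma * rng.standard_normal(n_outer)
    log_lik = const - 0.5 * ((y - design * theta) / sigma) ** 2

    # Inner samples: fresh prior draws to estimate the marginal log p(y | design)
    # for every outer y, via a numerically stable log-mean-exp.
    theta_in = rng.standard_normal(n_inner)
    log_lik_mat = const - 0.5 * ((y[:, None] - design * theta_in[None, :]) / sigma) ** 2
    m = log_lik_mat.max(axis=1, keepdims=True)
    log_marg = m[:, 0] + np.log(np.mean(np.exp(log_lik_mat - m), axis=1))

    # Average of log-likelihood-ratio samples estimates the mutual information.
    return float(np.mean(log_lik - log_marg))
```

For this particular model the EIG is available in closed form, 0.5 * log(1 + design**2 / sigma**2), which makes it a useful sanity check: the nested estimator converges to it, though with a bias of order 1/n_inner from the inner marginal-likelihood estimate — one of the issues that motivates variational alternatives.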

Other research interests of mine include the role of equivariance in machine learning, and Bayesian modelling, particularly of network data and transactional data.

Publications

2020

  • A. Foster, M. Jankowiak, M. O'Meara, Y. W. Teh, T. Rainforth, A Unified Stochastic Gradient Approach to Designing Bayesian-Optimal Experiments, International Conference on Artificial Intelligence and Statistics (AISTATS, to appear), 2020.
    Project: tencent-lsml

2019

  • A. Foster, M. Jankowiak, E. Bingham, P. Horsfall, Y. W. Teh, T. Rainforth, N. Goodman, Variational Bayesian Optimal Experimental Design, Advances in Neural Information Processing Systems (NeurIPS, spotlight), 2019.
    Project: bigbayes

2018

  • B. Bloem-Reddy, A. Foster, E. Mathieu, Y. W. Teh, Sampling and Inference for Beta Neutral-to-the-Left Models of Sparse Networks, in Conference on Uncertainty in Artificial Intelligence, 2018.
    Project: bigbayes
  • A. Foster, M. Jankowiak, E. Bingham, Y. W. Teh, T. Rainforth, N. Goodman, Variational Optimal Experiment Design: Efficient Automation of Adaptive Experiments, NeurIPS Workshop on Bayesian Deep Learning, 2018.
    Project: bigbayes

2017

  • B. Bloem-Reddy, E. Mathieu, A. Foster, T. Rainforth, H. Ge, M. Lomelí, Z. Ghahramani, Y. W. Teh, Sampling and inference for discrete random probability measures in probabilistic programs, NIPS Workshop on Advances in Approximate Bayesian Inference, 2017.
    Project: bigbayes