Adam Foster

Probabilistic inference, optimal experiment design, deep learning, probabilistic programming

I am a DPhil student in Statistics at the University of Oxford, supervised by Yee Whye Teh. I received my Bachelor's and Master's degrees in mathematics from the University of Cambridge and worked as a machine learning engineer before joining the department. I am interested in statistical machine learning, with a current focus on optimal experiment design and mutual information estimation. I contribute to the deep probabilistic programming language Pyro and use it in my research. My other research interests include Bayesian nonparametric models of discrete structures, deep generative models, and Bayesian optimisation.

Publications

2018

  • B. Bloem-Reddy, A. Foster, E. Mathieu, Y. W. Teh, Sampling and Inference for Beta Neutral-to-the-Left Models of Sparse Networks, in Conference on Uncertainty in Artificial Intelligence, 2018.
    Project: bigbayes
  • A. Foster, M. Jankowiak, E. Bingham, Y. W. Teh, T. Rainforth, N. Goodman, Variational Optimal Experiment Design: Efficient Automation of Adaptive Experiments, in NeurIPS Workshop on Bayesian Deep Learning, 2018.
    Project: bigbayes

2017

  • B. Bloem-Reddy, E. Mathieu, A. Foster, T. Rainforth, H. Ge, M. Lomelí, Z. Ghahramani, Y. W. Teh, Sampling and inference for discrete random probability measures in probabilistic programs, in NIPS Workshop on Advances in Approximate Bayesian Inference, 2017.
    Project: bigbayes