I am a second-year StatML graduate student at the University of Oxford, supervised by Tom Rainforth and Yee Whye Teh. I'm currently working on Bayesian Optimal Experimental Design (BOED) and am more broadly interested in probabilistic machine learning.
Before starting my PhD, I spent 4 years working in quant research/structuring in the City of London. Prior to that I studied MMORSE at the University of Warwick.
@article{rainforth2023modern,
  title = {Modern {Bayesian} experimental design},
  author = {Rainforth, Tom and Foster, Adam and Ivanova, Desi R. and Bickford Smith, Freddie},
  year = {2024},
  journal = {Statistical Science}
}
2021

A. Foster, D. R. Ivanova, I. Malik, T. Rainforth, Deep Adaptive Design: Amortizing Sequential Bayesian Experimental Design, International Conference on Machine Learning (ICML, long presentation), 2021.
We introduce Deep Adaptive Design (DAD), a general method for amortizing the cost of performing sequential adaptive experiments using the framework of Bayesian optimal experimental design (BOED). Traditional sequential BOED approaches require substantial computational time at each stage of the experiment. This makes them unsuitable for most real-world applications, where decisions must typically be made quickly. DAD addresses this restriction by learning an amortized design network upfront and then using this to rapidly run (multiple) adaptive experiments at deployment time. This network takes as input the data from previous steps, and outputs the next design using a single forward pass; these design decisions can be made in milliseconds during the live experiment. To train the network, we introduce contrastive information bounds that are suitable objectives for the sequential setting, and propose a customized network architecture that exploits key symmetries. We demonstrate that DAD successfully amortizes the process of experimental design, outperforming alternative strategies on a number of problems.
@inproceedings{foster2021deep,
  title = {{Deep Adaptive Design: Amortizing Sequential Bayesian Experimental Design}},
  author = {Foster, Adam and Ivanova, Desi R. and Malik, Ilyas and Rainforth, Tom},
  year = {2021},
  booktitle = {International Conference on Machine Learning (ICML, long presentation)}
}
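As a rough illustration of the single forward pass described in the abstract, here is a minimal NumPy sketch of a design network that maps the experiment history to the next design via sum-pooling, the kind of permutation symmetry the DAD architecture exploits. The dimensions, weights, and function names are illustrative placeholders, not the trained DAD network from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 1-D designs/outcomes, 8-unit hidden layer.
D_HID = 8
W_enc = rng.normal(size=(2, D_HID))  # encodes one (design, outcome) pair
W_out = rng.normal(size=D_HID)       # maps pooled history to next design

def next_design(history):
    """Single forward pass: experiment history -> next design.

    history: list of (design, outcome) pairs observed so far.
    Sum-pooling makes the output invariant to the order of past
    experiments, one of the key symmetries a DAD-style network exploits,
    and lets one trained network serve histories of any length.
    """
    if not history:
        pooled = np.zeros(D_HID)                # empty-history embedding
    else:
        pairs = np.array(history)               # shape (t, 2)
        pooled = np.tanh(pairs @ W_enc).sum(0)  # permutation-invariant pool
    return float(np.tanh(pooled @ W_out))       # next design in [-1, 1]

# Reordering past experiments does not change the proposed design:
h = [(0.2, 1.3), (-0.5, 0.7)]
assert np.isclose(next_design(h), next_design(h[::-1]))
```

Because the design is produced by one cheap forward pass rather than an inner optimisation, decisions like this can be made in milliseconds during a live experiment, which is the amortization the paper refers to.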
D. R. Ivanova, A. Foster, S. Kleinegesse, M. U. Gutmann, T. Rainforth, Implicit Deep Adaptive Design: Policy-Based Experimental Design without Likelihoods, 35th Conference on Neural Information Processing Systems (NeurIPS 2021), 2021.
We introduce implicit Deep Adaptive Design (iDAD), a new method for performing adaptive experiments in real-time with implicit models. iDAD amortizes the cost of Bayesian optimal experimental design (BOED) by learning a design policy network upfront, which can then be deployed quickly at the time of the experiment. The iDAD network can be trained on any model which simulates differentiable samples, unlike previous design policy work that requires a closed form likelihood and conditionally independent experiments. At deployment, iDAD allows design decisions to be made in milliseconds, in contrast to traditional BOED approaches that require heavy computation during the experiment itself. We illustrate the applicability of iDAD on a number of experiments, and show that it provides a fast and effective mechanism for performing adaptive design with implicit models.
@inproceedings{ivanova2021implicit,
  title = {{Implicit Deep Adaptive Design: Policy-Based Experimental Design without Likelihoods}},
  author = {Ivanova, Desi R. and Foster, Adam and Kleinegesse, Steven and Gutmann, Michael U. and Rainforth, Tom},
  year = {2021},
  booktitle = {35th Conference on Neural Information Processing Systems (NeurIPS 2021)}
}
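To illustrate the likelihood-free ingredient above, here is a minimal NumPy sketch of an InfoNCE-style contrastive lower bound on a design's information gain, computed purely from simulator samples, in the spirit of the bounds iDAD optimises its policy network against. The simulator, the hand-made critic, and all dimensions are illustrative assumptions; in iDAD the critic is a learned network trained jointly with the design policy.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, design):
    """Implicit model: samples are differentiable in the design, but the
    likelihood p(y | theta, design) is treated as unavailable."""
    return theta * design + 0.1 * rng.standard_normal(theta.shape)

def infonce_bound(thetas, design, critic):
    """InfoNCE-style lower bound on the mutual information between theta
    and the outcome at this design, estimated from samples alone: each
    jointly sampled pair (theta_i, y_i) is contrasted against mismatched
    pairs, with no likelihood evaluations required."""
    ys = simulate(thetas, design)
    scores = critic(thetas[:, None], ys[None, :], design)  # (N, N) matrix
    # Softmax cross-entropy: diagonal entries are the matched pairs.
    log_probs = np.diag(scores) - np.log(np.exp(scores).sum(axis=1))
    return log_probs.mean() + np.log(len(thetas))

def critic(theta, y, design, sigma=0.1):
    # Hypothetical fixed critic matching the toy simulator's noise scale;
    # iDAD learns this function instead of hand-crafting it.
    return -((y - theta * design) ** 2) / (2 * sigma ** 2)

thetas = rng.standard_normal(256)
# A design with a stronger signal certifies more information about theta:
assert infonce_bound(thetas, 2.0, critic) > infonce_bound(thetas, 0.1, critic)
```

Since the bound depends on the design only through simulated samples, gradients can flow through the differentiable simulator to train the design policy, which is why iDAD needs differentiable samples rather than a closed-form likelihood.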