As datasets grow ever larger in scale, complexity, and variety, there is an increasing need for powerful machine learning and statistical techniques capable of learning from such data. Bayesian nonparametrics is a promising approach to data analysis that is increasingly popular in machine learning and statistics. Bayesian nonparametric models are highly flexible models with infinite-dimensional parameter spaces that can be used to directly parameterise and learn about functions, densities, conditional distributions, and so on. This ERC-funded project aims to develop Bayesian nonparametric techniques for learning rich representations from structured data in a computationally efficient and scalable manner.
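As a minimal illustration of the infinite-dimensional priors the project studies (a generic sketch, not the method of any particular paper below), the Dirichlet process can be represented by its stick-breaking construction: an infinite sequence of mixture weights obtained by repeatedly breaking off Beta-distributed fractions of a unit-length stick. The helper name `stick_breaking_weights` and the truncation level are illustrative choices, not part of any published code.

```python
import numpy as np

def stick_breaking_weights(alpha, truncation, rng):
    """Draw mixture weights from a truncated Dirichlet process prior via
    stick-breaking: w_k = v_k * prod_{j<k}(1 - v_j), v_k ~ Beta(1, alpha)."""
    v = rng.beta(1.0, alpha, size=truncation)
    # Length of stick remaining before each break: 1, (1-v_1), (1-v_1)(1-v_2), ...
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v[:-1])])
    return v * remaining

rng = np.random.default_rng(0)
w = stick_breaking_weights(alpha=2.0, truncation=1000, rng=rng)
# With a generous truncation the leftover stick mass is negligible,
# so the weights sum to (essentially) one; smaller alpha concentrates
# mass on fewer atoms, yielding sparser mixtures.
```

In a full Dirichlet process mixture, each weight `w[k]` would be paired with an atom drawn from a base measure, giving a random discrete distribution over infinitely many mixture components.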

## 2019

• , Y. Lee , J. Kim , A. Kosiorek , S. Choi , , Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks, in International Conference on Machine Learning (ICML), 2019.
Project: bigbayes
• L. T. Elliott , M. De Iorio , S. Favaro , K. Adhikari , , Modeling Population Structure Under Hierarchical Dirichlet Processes, Bayesian Analysis, Jun. 2019.
Project: bigbayes
• , , , M. P. Kumar , A Statistical Approach to Assessing Neural Network Robustness, in International Conference on Learning Representations (ICLR), 2019.
Project: bigbayes
• , L. James , S. Choi , , A Bayesian model for sparse graphs with flexible degree distribution and overlapping community structure, in Artificial Intelligence and Statistics (AISTATS), 2019.
Project: bigbayes
• , , Probabilistic symmetry and invariant neural networks, Jan. 2019.
Project: bigbayes
• , M. Chirico , P. Pereira , C. Loeffler , Scalable high-resolution forecasting of sparse spatiotemporal events with kernel methods: a winning solution to the NIJ “Real-Time Crime Forecasting Challenge,” revise and resubmit at Annals of Applied Statistics, 2019.
Project: bigbayes
• A. Raj , H. Law , , M. Park , A Differentially Private Kernel Two-Sample Test, in European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD), 2019, to appear.
Project: bigbayes
• , , N. Siddharth , , Disentangling Disentanglement in Variational Autoencoders, in Proceedings of the 36th International Conference on Machine Learning, Long Beach, California, USA, 2019, vol. 97, 4402–4412.
Project: bigbayes
• , M. Jankowiak , E. Bingham , P. Horsfall , , , N. Goodman , Variational Bayesian Optimal Experimental Design, Advances in Neural Information Processing Systems (NeurIPS, spotlight), 2019.
Project: bigbayes
• F. Locatello , G. Abbati , , S. Bauer , B. Schölkopf , O. Bachem , On the Fairness of Disentangled Representations, Advances in Neural Information Processing Systems (NeurIPS, to appear), 2019.
Project: bigbayes
• , F. Wood , , Amortized Monte Carlo Integration, International Conference on Machine Learning (ICML, Best Paper honorable mention), 2019.
Project: bigbayes
• A. Golinski* , M. Lezcano-Casado* , , Improving Normalizing Flows via Better Orthogonal Parameterizations, ICML Workshop on Invertible Neural Nets and Normalizing Flows, 2019.
Project: bigbayes
• , L. Chan , , , Noise Contrastive Meta-Learning for Conditional Density Estimation using Kernel Mean Embeddings, arXiv preprint arXiv:1906.02236, 2019.
• , , T. Kohn , , H. Yang , F. Wood , LF-PPL: A Low-Level First Order Probabilistic Programming Language for Non-Differentiable Models, in The 22nd International Conference on Artificial Intelligence and Statistics, 2019, 148–157.
Project: bigbayes
• , H. Yang , , , Divide, Conquer, and Combine: a New Inference Strategy for Probabilistic Programs with Stochastic Support, International Conference on Machine Learning (ICML, to appear), 2019.
Project: bigbayes

## 2018

• , , , Modelling sparsity, heterogeneity, reciprocity and community structure in temporal interaction data, in Advances in Neural Information Processing Systems (NeurIPS), 2018.
Project: bigbayes
• , , F. Wood , , Amortized Monte Carlo Integration, in Symposium on Advances in Approximate Bayesian Inference, 2018.
Project: bigbayes
• , , Y. Teh , Causal Inference via Kernel Deviance Measures, in Advances in Neural Information Processing Systems (NeurIPS), 2018.
Project: bigbayes
• J. Chen , J. Zhu , , T. Zhang , Stochastic Expectation Maximization with Variance Reduction, in Advances in Neural Information Processing Systems (NeurIPS), 2018, 7978–7988.
• , , , , Sampling and Inference for Beta Neutral-to-the-Left Models of Sparse Networks, in Conference on Uncertainty in Artificial Intelligence, 2018.
Project: bigbayes
• , P. Orbanz , Random-Walk Models of Network Formation and Sequential Monte Carlo Methods for Graphs, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol. 80, no. 5, 871–898, Aug. 2018.
Project: bigbayes
• , , T. A. Le , , M. Igl , F. Wood , , Tighter Variational Bounds are Not Necessarily Better, in International Conference on Machine Learning (ICML), 2018.
Project: bigbayes
• , A. Perotte , N. Elhadad , R. Ranganath , Deep Survival Analysis: Nonparametrics and Missingness, in Proceedings of Machine Learning Research (PMLR), 2018.
Project: bigbayes
• , S. Favaro , D. M. Roy , , A Characterization of Product-Form Exchangeable Feature Probability Functions, Annals of Applied Probability, vol. 28, Jun. 2018.
Project: bigbayes
• , , Scaling up the Automatic Statistician: Scalable Structure Discovery using Gaussian Processes, in Artificial Intelligence and Statistics (AISTATS), 2018.
Project: bigbayes
• , , A. Gretton , , Large-Scale Kernel Methods for Independence Testing, Statistics and Computing, vol. 28, no. 1, 113–130, Jan. 2018.
Project: bigbayes
• , , F. Camerlenghi , S. Favaro , Consistent estimation of the missing mass for feature models, 2018.
Project: bigbayes
• , S. Favaro , , Bayesian nonparametric approaches to sample-size estimation for finding unseen species, 2018.
Project: bigbayes
• , , F. Camerlenghi , S. Favaro , On the consistent estimation of the missing mass, 2018.
Project: bigbayes
• , , F. Camerlenghi , S. Favaro , On the Good-Turing estimator for feature allocation models, 2018.
Project: bigbayes
• , , Neural network models of exchangeable sequences, NeurIPS Workshop on Bayesian Deep Learning, 2018.
Project: bigbayes
• C. Loeffler , , Is gun violence contagious? A spatiotemporal test, Journal of Quantitative Criminology, vol. 34, no. 4, 999–1017, 2018.
Project: bigbayes
• , M. Jankowiak , E. Bingham , , , N. Goodman , Variational Optimal Experiment Design: Efficient Automation of Adaptive Experiments, NeurIPS Workshop on Bayesian Deep Learning, 2018.
Project: bigbayes
• , , R. Zinkov , N. Siddharth , , , F. Wood , Faithful Inversion of Generative Models for Effective Amortized Inference, in Advances in Neural Information Processing Systems (NeurIPS), 2018.
Project: bigbayes
• , , , I. Posner , Sequential Attend, Infer, Repeat: Generative Modelling of Moving Objects, in Advances in Neural Information Processing Systems (NeurIPS), 2018.
Project: bigbayes
• , A. Mnih , Disentangling by Factorising, in International Conference on Machine Learning (ICML), 2018.
Project: bigbayes
• H. Law , , E. Cameron , T. Lucas , , K. Battle , K. Fukumizu , Variational Learning on Aggregate Outputs with Gaussian Processes, in Advances in Neural Information Processing Systems (NeurIPS), 2018, to appear.
Project: bigbayes
• J. Heo , H. Lee , S. Kim , , K. Kim , E. Yang , S. Hwang , Uncertainty-aware attention for reliable interpretation and prediction, in Advances in Neural Information Processing Systems (NeurIPS), 2018.
Project: bigbayes
• H. Lee , , S. Kim , E. Yang , S. Hwang , DropMax: adaptive variational softmax, in Advances in Neural Information Processing Systems (NeurIPS), 2018.
Project: bigbayes
• , , , , F. Wood , H. Yang , J.-W. van de Meent , Inference Trees: Adaptive Inference with Exploration, arXiv preprint arXiv:1806.09550, 2018.
Project: bigbayes
• , , , J.-W. van de Meent , , On Exploration, Exploitation and Learning in Adaptive Importance Sampling, arXiv preprint arXiv:1810.13296, 2018.
Project: bigbayes
• , Nesting Probabilistic Programs, Conference on Uncertainty in Artificial Intelligence (UAI), 2018.
Project: bigbayes
• , R. Cornish , H. Yang , A. Warrington , F. Wood , On Nesting Monte Carlo Estimators, International Conference on Machine Learning (ICML), 2018.
Project: bigbayes
• , , , S. Bhatt , Spatial Mapping with Gaussian Processes and Nonstationary Fourier Features, Spatial Statistics, vol. 28, 59–78, 2018.
Project: bigbayes
• H. Law , D. Sutherland , , , Bayesian Approaches to Distribution Regression, in Artificial Intelligence and Statistics (AISTATS), 2018.
Project: bigbayes

## 2017

• , P. A. Jenkins , D. Spano , , Poisson Random Fields for Dynamic Feature Models, Journal of Machine Learning Research (JMLR), Dec. 2017.
Project: bigbayes
• G. Di Benedetto , , , Non-exchangeable random partition models for microclustering, Nov. 2017.
Project: bigbayes
• , P. Orbanz , Preferential Attachment and Vertex Arrival Times, Oct. 2017.
Project: bigbayes
• A. Todeschini , , , Exchangeable Random Measures for Sparse and Modular Graphs with Overlapping Communities, Aug. 2017.
Project: bigbayes
• J. Arbel , S. Favaro , B. Nipoti , , Bayesian nonparametric inference for discovery probabilities: credible intervals and large sample asymptotics, Statistica Sinica, Apr. 2017.
Project: bigbayes
• , , , , , Relativistic Monte Carlo, in Artificial Intelligence and Statistics (AISTATS), 2017.
Project: bigbayes
• , S. Favaro , Discussion of F. Caron and E. B. Fox, "Sparse graphs using exchangeable random measures.", Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol. 79, no. 5, 2017.
Project: bigbayes
• S. Bacallado , , S. Favaro , L. Trippa , Sufficientness postulates for Gibbs-type priors and hierarchical generalizations, Statistical Science, vol. 32, 487–500, 2017.
Project: bigbayes
• B. Goodman , , European Union Regulations on Algorithmic Decision Making and a “Right to Explanation,” AI Magazine, vol. 38, no. 3, 50–58, 2017.
Project: bigbayes
• , , , S. Vollmer , B. Lakshminarayanan , C. Blundell , , Distributed Bayesian Learning with Stochastic Natural Gradient Expectation Propagation and the Posterior Server, Journal of Machine Learning Research, vol. 18, no. 106, 1–37, 2017.
Project: bigbayes sgmcmc
• , D. Belgrave , A Birth-Death Modelling Framework for Inferring Disease Causality within the Context of Allergy Development., in 16th IEEE International Conference on Machine Learning and Applications (ICMLA), 2017.
Project: bigbayes
• , D. A. Knowles , Z. Ghahramani , A birth-death process for feature allocation., in Proceedings of the 34th International Conference on Machine Learning, 2017.
Project: bigbayes
• , , , , H. Ge , M. Lomelí , Z. Ghahramani , , Sampling and inference for discrete random probability measures in probabilistic programs, NIPS Workshop on Advances in Approximate Bayesian Inference, 2017.
Project: bigbayes
• , Y. Teh , , Poisson Intensity Estimation with Reproducing Kernels, Electronic Journal of Statistics, vol. 11, no. 2, 5081–5104, 2017.
Project: bigbayes
• , , , , Feature-to-Feature Regression for a Two-Step Conditional Independence Test, in Uncertainty in Artificial Intelligence (UAI), 2017.
Project: bigbayes
• , , , Deep Kernel Machines via the Kernel Reparametrization Trick, in International Conference on Learning Representations (ICLR) Workshop Track, 2017.
Project: bigbayes
• , , , Poisson Intensity Estimation with Reproducing Kernels, in Artificial Intelligence and Statistics (AISTATS), 2017.
Project: bigbayes
• M. Lomeli , S. Favaro , , A Marginal Sampler for σ-Stable Poisson-Kingman Mixture Models, Journal of Computational and Graphical Statistics, 2017.
Project: bigbayes

## 2016

• , D. Sutherland , Y. Wang , , Understanding the 2016 US Presidential Election using ecological inference and distribution regression with census microdata, arXiv e-prints, Nov. 2016.
Project: bigbayes
• , , , A Bayesian nonparametric model for sparse dynamic networks, Jun. 2016.
Project: bigbayes
• N. Heard , , M. Skoularidou , Topic modelling of authentication events in an enterprise computer network, 2016.
Project: bigbayes
• , , , DR-ABC: Approximate Bayesian Computation with Kernel-Based Distribution Regression, in International Conference on Machine Learning (ICML), 2016, 1482–1491.
Project: bigbayes
• , , J. Cunningham , , Bayesian Learning of Kernel Embeddings, in Uncertainty in Artificial Intelligence (UAI), 2016, 182–191.
Project: bigbayes
• T. Fernandez , , Posterior Consistency for a Non-parametric Survival Model under a Gaussian Process Prior, 2016.
Project: bigbayes
• T. Fernandez , N. Rivera , , Gaussian Processes for Survival Analysis, in Advances in Neural Information Processing Systems (NeurIPS), 2016.
Project: bigbayes
• , , Scalable Structure Discovery in Regression using Gaussian Processes, in Proceedings of the 2016 Workshop on Automatic Machine Learning, 2016.
Project: bigbayes
• L. T. Elliott , , A Nonparametric HMM for Genetic Imputation and Coalescent Inference, Electronic Journal of Statistics, 2016.
Project: bigbayes
• S. Favaro , A. Lijoi , C. Nava , B. Nipoti , I. Prünster , , On the Stick-Breaking Representation for Homogeneous NRMIs, Bayesian Analysis, vol. 11, 697–724, 2016.
Project: bigbayes
• , Bayesian Nonparametric Modelling and the Ubiquitous Ewens Sampling Formula, Statistical Science, vol. 31, no. 1, 34–36, 2016.
Project: bigbayes
• M. Balog , B. Lakshminarayanan , Z. Ghahramani , D. M. Roy , , The Mondrian Kernel, in Uncertainty in Artificial Intelligence (UAI), 2016.
Project: bigbayes
• B. Lakshminarayanan , D. M. Roy , , Mondrian Forests for Large-Scale Regression when Uncertainty Matters, in Artificial Intelligence and Statistics (AISTATS), 2016.
Project: bigbayes
• , , , Bayesian Nonparametrics for Sparse Dynamic Networks, 2016.
Project: bigbayes
• , S. Favaro , , Multi-armed bandit for species discovery: A Bayesian nonparametric approach, Journal of the American Statistical Association, 2016.
Project: bigbayes

## 2015

• A. G. Deshwar , , J. Wintersinger , P. C. Boutros , , Q. Morris , Abstract B2-59: PhyloSpan: using multimutation reads to resolve subclonal architectures from heterogeneous tumor samples, AACR Special Conference on Computational and Systems Biology of Cancer, vol. 75, 2015.
Project: bigbayes
• S. Favaro , B. Nipoti , , Rediscovery of Good-Turing Estimators via Bayesian Nonparametrics, Biometrics, 2015.
Project: bigbayes
• P. G. Moreno , A. Artés-Rodríguez , , F. Perez-Cruz , Bayesian Nonparametric Crowdsourcing, Journal of Machine Learning Research (JMLR), 2015.
Project: bigbayes
• M. Lomeli , S. Favaro , , A hybrid sampler for Poisson-Kingman mixture models, in Advances in Neural Information Processing Systems (NeurIPS), 2015.
Project: bigbayes
• M. De Iorio , S. Favaro , , Bayesian Inference on Population Structure: From Parametric to Nonparametric Modeling, in Nonparametric Bayesian Inference in Biostatistics, Springer, 2015.
Project: bigbayes
• S. Favaro , B. Nipoti , , Random variate generation for Laguerre-type exponentially tilted α-stable distributions, Electronic Journal of Statistics, vol. 9, 1230–1242, 2015.
Project: bigbayes
• M. Balog , , The Mondrian Process for Machine Learning, 2015.
Project: bigbayes
• P. Orbanz , L. James , , Scaled subordinators and generalizations of the Indian buffet process, 2015.
Project: bigbayes
• M. De Iorio , L. Elliott , S. Favaro , , Bayesian Nonparametric Inference of Population Admixtures, 2015.
Project: bigbayes
• B. Lakshminarayanan , D. M. Roy , , Particle Gibbs for Bayesian Additive Regression Trees, in Proceedings of the International Conference on Artificial Intelligence and Statistics, 2015.
Project: bigbayes sgmcmc

## 2014

• S. Favaro , M. Lomeli , , On a Class of σ-stable Poisson-Kingman Models and an Effective Marginalized Sampler, Statistics and Computing, 2014.
Project: bigbayes
• S. Favaro , M. Lomeli , B. Nipoti , , On the Stick-Breaking Representation of σ-stable Poisson-Kingman Models, Electronic Journal of Statistics, vol. 8, 1063–1085, 2014.
Project: bigbayes
• B. Lakshminarayanan , D. Roy , , Mondrian Forests: Efficient Online Random Forests, in Advances in Neural Information Processing Systems (NeurIPS), 2014.
Project: bigbayes
