Bayesian nonparametrics, probabilistic modeling and inference

I am a Postdoctoral Research Assistant in Statistical Machine Learning. Previously, I completed my Ph.D. in Statistics at Columbia University, where I was advised by Peter Orbanz. My research focuses on probabilistic and statistical analysis of networks and other discrete data like partitions and permutations. I am generally interested in all aspects of machine learning, both theoretical and applied.

Publications

2017

B. Bloem-Reddy, Discussion of F. Caron and E. B. Fox, "Sparse graphs using exchangeable random measures," Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol. 79, no. 5, 2017.

@article{BloemReddy:2017aa,
author = {Bloem-Reddy, B.},
journal = {Journal of the Royal Statistical Society: Series B (Statistical Methodology)},
number = {5},
title = {Discussion of F. Caron and E. B. Fox, "Sparse graphs using exchangeable random measures."},
volume = {79},
year = {2017}
}

B. Bloem-Reddy and P. Orbanz, Preferential Attachment and Vertex Arrival Times, arXiv:1710.02159, Oct. 2017.

@article{Bloem-Reddy:Orbanz:2017,
archiveprefix = {arXiv},
author = {Bloem-Reddy, Benjamin and Orbanz, Peter},
eprint = {1710.02159},
month = oct,
primaryclass = {math.PR},
title = {Preferential Attachment and Vertex Arrival Times},
year = {2017}
}

B. Bloem-Reddy, E. Mathieu, A. Foster, T. Rainforth, H. Ge, M. Lomelí, Z. Ghahramani, and Y. W. Teh, Sampling and inference for discrete random probability measures in probabilistic programs, NIPS Workshop on Advances in Approximate Bayesian Inference, 2017.

We consider the problem of sampling a sequence from a discrete random probability measure (RPM) with countable support, under (probabilistic) constraints of finite memory and computation. A canonical example is sampling from the Dirichlet Process, which can be accomplished using its stick-breaking representation and lazy initialization of its atoms. We show that efficient lazy initialization is possible if and only if a size-biased representation of the discrete RPM is used. For models constructed from such discrete RPMs, we consider the implications for generic particle-based inference methods in probabilistic programming systems. To demonstrate, we implement SMC for Normalized Inverse Gaussian Process mixture models in Turing.
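As a rough illustration of the lazy initialization the abstract describes, the sketch below samples from a Dirichlet Process via its stick-breaking representation, breaking off a new stick piece and drawing its atom only when an observation falls past the mass generated so far. This is a minimal sketch, not the paper's implementation; the function name, the choice of a uniform base measure, and the seed are illustrative assumptions.

```python
import random

def stick_breaking_sampler(alpha, n, seed=0):
    """Draw n observations from a Dirichlet Process DP(alpha, H) using
    stick-breaking with lazily initialized atoms. Illustrative sketch:
    the base measure H is taken to be Uniform(0, 1)."""
    rng = random.Random(seed)
    weights = []     # stick-breaking weights, generated on demand
    atoms = []       # atom locations, drawn lazily from H
    remaining = 1.0  # mass of the unbroken remainder of the stick
    samples = []
    for _ in range(n):
        u = rng.random()  # uniform draw locating this observation
        cum, k = 0.0, 0
        while True:
            if k == len(weights):
                # Lazily break off the next piece: V_k ~ Beta(1, alpha)
                v = rng.betavariate(1.0, alpha)
                weights.append(remaining * v)
                atoms.append(rng.random())  # atom drawn from H
                remaining *= (1.0 - v)
            cum += weights[k]
            if u < cum:
                samples.append(atoms[k])
                break
            k += 1
    return samples, len(atoms)
```

Only the atoms actually reached by some observation are ever instantiated, so memory grows with the number of clusters touched rather than with the (countably infinite) support of the RPM.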

@article{bloemreddy2017rpm,
title = {Sampling and inference for discrete random probability measures in probabilistic programs},
author = {Bloem-Reddy, Benjamin and Mathieu, Emile and Foster, Adam and Rainforth, Tom and Ge, Hong and Lomelí, María and Ghahramani, Zoubin and Teh, Yee Whye},
journal = {NIPS Workshop on Advances in Approximate Bayesian Inference},
year = {2017}
}

2016

B. Bloem-Reddy and J. P. Cunningham, Slice Sampling on Hamiltonian Trajectories, in International Conference on Machine Learning (ICML), 2016, vol. 33, pp. 3050–3058.

@inproceedings{BloemReddy:Cunningham:2016,
author = {Bloem-Reddy, B. and Cunningham, J. P.},
booktitle = {International Conference on Machine Learning (ICML)},
pages = {3050--3058},
title = {Slice Sampling on Hamiltonian Trajectories},
volume = {33},
year = {2016}
}

B. Bloem-Reddy and P. Orbanz, Random Walk Models of Network Formation and Sequential Monte Carlo Methods for Graphs, arXiv:1612.06404, Dec. 2016.

@article{Bloem-Reddy:Orbanz:2016,
archiveprefix = {arXiv},
author = {Bloem-Reddy, Benjamin and Orbanz, Peter},
eprint = {1612.06404},
month = dec,
primaryclass = {stat.ME},
title = {Random Walk Models of Network Formation and Sequential Monte Carlo Methods for Graphs},
year = {2016}
}