Emile Mathieu

Probabilistic Inference, Deep Learning, Generative Models, Representation Learning, Geometry

I am a second-year DPhil student in statistics at the University of Oxford, supervised by Prof. Yee Whye Teh and Ryota Tomioka of Microsoft Research. Previously, I received a joint MSc from École des Ponts ParisTech and École Normale Supérieure Paris-Saclay. My research interests lie in probabilistic generative models, representation learning, and geometry.

Publications

2021

  • E. Mathieu, A. Foster, Y. W. Teh, On Contrastive Representations of Stochastic Processes, in 35th Conference on Neural Information Processing Systems (NeurIPS 2021), 2021.

2020

  • E. Mathieu, M. Nickel, Riemannian Continuous Normalizing Flows, in Advances in Neural Information Processing Systems 33, 2020.

2019

  • E. Mathieu, C. Le Lan, C. J. Maddison, R. Tomioka, Y. W. Teh, Continuous Hierarchical Representations with Poincaré Variational Auto-Encoders, in Advances in Neural Information Processing Systems 32, 2019, 12565–12576.
  • E. Mathieu, T. Rainforth, N. Siddharth, Y. W. Teh, Disentangling Disentanglement in Variational Autoencoders, in Proceedings of the 36th International Conference on Machine Learning, Long Beach, California, USA, 2019, vol. 97, 4402–4412.
    Project: bigbayes

2018

  • B. Bloem-Reddy, A. Foster, E. Mathieu, Y. W. Teh, Sampling and Inference for Beta Neutral-to-the-Left Models of Sparse Networks, in Conference on Uncertainty in Artificial Intelligence, 2018.
    Project: bigbayes

2017

  • B. Bloem-Reddy, E. Mathieu, A. Foster, T. Rainforth, H. Ge, M. Lomelí, Z. Ghahramani, Y. W. Teh, Sampling and Inference for Discrete Random Probability Measures in Probabilistic Programs, NIPS Workshop on Advances in Approximate Bayesian Inference, 2017.
    Project: bigbayes