News

NeurIPS 2018 Workshop Participation
Members of the group are organizing and participating in a number of NeurIPS 2018 workshops and a co-located symposium:

Tom Rainforth is giving an invited talk, “Inference Trees: Adaptive Inference with Exploration”, at the Symposium on Advances in Approximate Bayesian Inference.

Ben Bloem-Reddy is giving an invited talk, “Left-neutrality: an old friend in the mirror”, at the NeurIPS Workshop on Bayesian Nonparametrics.

Tom Rainforth, Ben Bloem-Reddy, and Yee Whye Teh are co-organizing (with Brooks Paige, Matt Kusner, and Rick Caruana) a workshop on Critiquing and Correcting Trends in Machine Learning, scheduled for Friday, December 7.

Yee Whye Teh is co-organizing the NeurIPS Workshop on Continual Learning, scheduled for Friday, December 7.
Contributed papers being presented as posters and/or spotlights:

Ho Chung Leon Law, Peilin Zhao, Junzhou Huang, and Dino Sejdinovic. Hyperparameter Learning via Distributional Transfer. NeurIPS Workshop on Meta-Learning.

Emile Mathieu*, Tom Rainforth*, Siddharth Narayanaswamy* and Yee Whye Teh. Disentangling Disentanglement. NeurIPS Workshop on Bayesian Deep Learning. (*Equal contribution.)

Adam Foster, Martin Jankowiak, Eli Bingham, Yee Whye Teh, Tom Rainforth and Noah Goodman. Variational Optimal Experiment Design: Efficient Automation of Adaptive Experiments. NeurIPS Workshop on Bayesian Deep Learning.

Benjamin Bloem-Reddy and Yee Whye Teh. Neural network models of exchangeable sequences. NeurIPS Workshop on Bayesian Deep Learning.

Hyunjik Kim, Andriy Mnih, Jonathan Schwarz, Marta Garnelo, Ali Eslami, Dan Rosenbaum, Oriol Vinyals, Yee Whye Teh. Attentive Neural Processes. NeurIPS Workshop on Bayesian Deep Learning.

Tuan Anh Le, Hyunjik Kim, Marta Garnelo, Dan Rosenbaum, Jonathan Schwarz, Yee Whye Teh. Empirical Evaluation of Neural Process Objectives. NeurIPS Workshop on Bayesian Deep Learning.

Aki Matsukawa, Yee Whye Teh, Dilan Gorur, Balaji Lakshminarayanan. Hybrid Models with Deep and Invertible Features. NeurIPS Workshop on Bayesian Deep Learning.

Dieterich Lawson, George Tucker, Christian A. Naesseth, Christopher Maddison, Ryan P. Adams, Yee Whye Teh. Twisted Variational Sequential Monte Carlo.

Balaji Lakshminarayanan, Aki Matsukawa, Dilan Gorur, Yee Whye Teh. Do Deep Generative Models Know What They Don’t Know? NeurIPS Workshop on Bayesian Deep Learning.

Jovana Mitrovic, Peter Wirnsberger, Charles Blundell, Dino Sejdinovic, Yee Whye Teh. Infinitely Deep Infinite-Width Networks.

Tim G. J. Rudner, Vincent Fortuin, Yee Whye Teh, Yarin Gal. On the Connection between Neural Processes and Approximate Gaussian Processes. NeurIPS Workshop on Bayesian Deep Learning.

Tim G. J. Rudner, Marc Rußwurm, Jakub Fil, Ramona Pelich, Benjamin Bischke, Veronika Kopackova, Piotr Bilinski. Rapid Computer Vision-aided Disaster Response via Fusion of Multiresolution, Multisensor, and Multitemporal Satellite Imagery. NeurIPS Workshop on AI for Social Good.

Matthew Fellows, Anuj Mahajan, Tim G. J. Rudner, Shimon Whiteson. VIREL: A Variational Inference Framework for Reinforcement Learning. NeurIPS Workshop on Probabilistic Reinforcement Learning and Structured Control.

Aidan N. Gomez, Ivan Zhang, Kevin Swersky, Yarin Gal, and Geoffrey E. Hinton. Targeted Dropout. NeurIPS Workshop on Compact Deep Neural Network Representation with Industrial Applications.

Tammo Rukat and Christopher Yau. Bayesian Nonparametric Boolean Factor Models. NeurIPS Workshop on Bayesian Nonparametrics.

Bradley Gram-Hansen, Patrick Helber, Indhu Varatharajan, Faiza Azam, Alejandro Coca Castro, Veronika Kopačková, Piotr Bilinski. Generating Material Maps to Map Informal Settlements. NeurIPS Workshop on Machine Learning for the Developing World.

Stefan Webb, Tom Rainforth, Yee Whye Teh, M. Pawan Kumar. Statistical Verification of Neural Networks. NeurIPS Workshop on Security in Machine Learning.

Matthew Willetts, Aiden Doherty, Stephen Roberts, Chris Holmes. Semi-unsupervised Learning using Deep Generative Models. NeurIPS Workshop on Bayesian Deep Learning and NeurIPS Workshop on Machine Learning for Health.

Adam Golinski, Yee Whye Teh, Frank Wood, Tom Rainforth. Amortized Monte Carlo Integration. Symposium on Advances in Approximate Bayesian Inference.

F.B. Fuchs, O. Groth, A.R. Kosiorek, A. Bewley, M. Wulfmeier, A. Vedaldi, I. Posner. Learning Physics with Neural Stethoscopes. NeurIPS Workshop on Modeling the Physical World: Learning, Perception, and Control.

Jonathan Schwarz, Andrew Joseph Dudzik, Oriol Vinyals, Razvan Pascanu, Yee Whye Teh. Towards a natural benchmark for continual learning. NeurIPS Workshop on Continual Learning.


NeurIPS 2018 Accepted Papers
13 papers co-authored by members of the group have been accepted to the main program of NeurIPS 2018:

V. Perrone, R. Jenatton, M. Seeger, C. Archambeau. Scalable Hyperparameter Transfer Learning. Poster: Thursday Poster Session B, AB #124.

J. Chan, V. Perrone, J. Spence, P. A. Jenkins, S. Mathieson and Yun S. Song. A Likelihood-Free Inference Framework for Population Genetic Data using Exchangeable Neural Networks. Poster: Tuesday Poster Session B, AB #111. Spotlight: Tuesday 16:05-16:10, Room 220 CD.

S. Lyddon, S. Walker, C. Holmes. Nonparametric learning from Bayesian models with randomized objective functions. Poster: Thursday Poster Session A, AB #24.

A. L. Caterini, A. Doucet, and D. Sejdinovic. Hamiltonian Variational Auto-Encoder. Poster: Wednesday Poster Session A, AB #5.

H. C. L. Law, D. Sejdinovic, E. Cameron, T. C. D. Lucas, S. Flaxman, K. Battle, and K. Fukumizu. Variational Learning on Aggregate Outputs with Gaussian Processes. Poster: Wednesday Poster Session B, AB #19.

H. Lee, J. Lee, S. Kim, E. Yang and S. J. Hwang. DropMax: Adaptive Variational Softmax. Poster: Tuesday Poster Session A, AB #76.

J. Heo, H. Lee, S. Kim, J. Lee, K. Kim, E. Yang, and S. J. Hwang. Uncertainty-Aware Attention for Reliable Interpretation and Prediction. Poster: Wednesday Poster Session B, AB #62.

Emilien Dupont. Learning Disentangled Joint Continuous and Discrete Representations. Poster: Thursday Poster Session A, AB #151.

X. Miscouridou, F. Caron, Y. W. Teh. Modelling sparsity, heterogeneity, reciprocity and community structure in temporal interaction data. Poster: Tuesday Poster Session A, AB #1.

S. Webb, A. Golinski, R. Zinkov, N Siddharth, T. Rainforth, Y. W. Teh, F. Wood. Faithful Inversion of Generative Models for Effective Amortized Inference. Poster: Thursday Poster Session A, AB #30.

J. Chen, J. Zhu, Y. W. Teh, T. Zhang. Stochastic Expectation Maximization with Variance Reduction. Poster: Wednesday Poster Session B, AB #16.

J. Mitrovic, D. Sejdinovic, Y. W. Teh. Causal Inference via Kernel Deviance Measures. Poster: Thursday Poster Session A, AB #9. Spotlight: Thursday 10:35-10:40, Room 220 E.

A. Kosiorek, H. Kim, Y. W. Teh, I. Posner. Sequential Attend, Infer, Repeat: Generative Modelling of Moving Objects. Poster: Wednesday Poster Session A, AB #24. Spotlight: Wednesday 10:25-10:30, Room 220 E.


4 UAI 2018 Accepted Papers
4 papers co-authored by members of the OxCSML group have been accepted to the main program of UAI 2018:

9 ICML 2018 Accepted Papers
9 papers co-authored by members of the OxCSML group have been accepted to the main program of ICML 2018:

3 AISTATS 2018 Accepted Papers
3 papers co-authored by members of the OxCSML group have been accepted to the main program of AISTATS 2018:

Yee Whye's Breiman Lecture
Slides and video for Yee Whye Teh’s Breiman keynote lecture can be found here.

NIPS Workshops Participation 2017
In addition to 6 papers in the main program of NIPS 2017 and Yee Whye Teh’s Breiman keynote lecture, OxCSML will be represented at several NIPS workshops with the following contributions.

Postdoctoral Research Assistant in Statistical Machine Learning
Applications are invited for a full-time postdoctoral research assistant in statistical machine learning, fixed-term for up to 2 years. Reporting to Professors Yee Whye Teh and Dino Sejdinovic, the postholder will be a member of the OxCSML (Oxford Computational Statistics and Machine Learning) research group, with responsibility for carrying out research on the Oxford-Tencent AI collaborative project on Large-Scale Machine Learning. The funds supporting this research project are provided by Tencent AI until October 2020.
Further details are here.
The closing date for applications is 12.00 noon on Monday 8 January 2018. Interviews will be held on Friday 26 January 2018.

Royal Society's Report on Machine Learning
The Royal Society’s Machine Learning Working Group, which included Professor Peter Donnelly and Professor Yee Whye Teh, issued a report entitled Machine Learning: the power and promise of computers that learn by example.
You can also hear about the report in a recent episode of the Talking Machines podcast.

Deep Learning Indaba 2017
“Indaba” is a Zulu word for a gathering or meeting to discuss the affairs of the community. Last September, such an Indaba took place in Johannesburg, South Africa, under the title “the Deep Learning Indaba”. Its aim was to attract students from South Africa, and Africa more broadly, and to build an understanding of the principles and practice of modern machine learning. Dr Konstantina Palla, a postdoctoral fellow in the OxCSML group, was invited to give a tutorial on probabilistic reasoning, with a focus on its connections to deep neural networks.

NIPS 2017
Yee Whye Teh will give a Breiman keynote lecture at NIPS 2017 entitled On Bayesian Deep Learning and Deep Bayesian Learning.
6 papers co-authored by members of the OxCSML group have been accepted to the main program of NIPS 2017:

Postdoctoral Research Assistant in Statistical Machine Learning
Applications are invited for a full-time postdoctoral research assistant in statistical machine learning, to work on Bayesian nonparametric methods for recommender systems.
Queries about the post should be addressed to Professor François Caron: caron@stats.ox.ac.uk.
The closing date for applications is 12.00 noon on Friday 18 August 2017.

Diversity in Machine Learning
On 25 May 2017 we will be hosting “Diversity in Machine Learning”, an event for undergraduates, featuring two great speakers, Stefanie Jegelka (MIT) and Raia Hadsell (DeepMind).

"Sparse graphs using exchangeable random measures" by Caron and Fox: RSS discussion paper meeting
“Sparse graphs using exchangeable random measures” by François Caron and Emily Fox will be presented to the Royal Statistical Society at the Discussion Meeting on Wednesday, May 10th at 5pm. More information here.

AISTATS and ICLR 2017 Accepted Papers
Three papers from the group have been accepted at AISTATS 2017 and one paper at ICLR 2017.
The papers are:

Poisson intensity estimation with reproducing kernels by Seth Flaxman, Yee Whye Teh, Dino Sejdinovic

Relativistic Monte Carlo by Xiaoyu Lu, Valerio Perrone, Leonard Hasenclever, Yee Whye Teh, Sebastian Vollmer

Encrypted accelerated least squares regression by Pedro Esperança, Louis Aslett, Chris Holmes

The Concrete Distribution: A Continuous Relaxation of Discrete Random Variables by Chris Maddison, Andriy Mnih, Yee Whye Teh


"What is machine learning?" Animation Launched
In collaboration with Oxford Sparks, machine learning group members Seth Flaxman, Hyunjik Kim, and Prof Yee Whye Teh created a two-minute animation answering the question, “What is machine learning?”.

NIPS 2016 participation
Many group members will be at NIPS 2016 presenting work at the main conference and workshops.
 Tamara Fernández will be presenting “Gaussian Processes for Survival Analysis” at the main conference.
 Stefan Webb will be presenting “A Tighter Monte Carlo Objective with Renyi alpha-Divergence Measures” at the Bayesian Deep Learning workshop.
 Hyunjik Kim will be presenting “Scalable Structure Discovery in Regression using Gaussian Processes” at the Practical Bayesian Nonparametrics workshop.
 Leonard Hasenclever, Stefan Webb and Thibaut Lienart will be presenting “Distributed Bayesian Learning with Stochastic Natural-gradient Expectation Propagation and the Posterior Server” at the Advances in Approximate Bayesian Inference and Bayesian Deep Learning workshops.
 Valerio Perrone and Xiaoyu Lu will be presenting “Relativistic Monte Carlo” at the Bayesian Deep Learning workshop.
 Konstantina Palla will be presenting “Bayesian nonparametrics for Sparse Dynamic Networks”, Xiaoyu Lu will be presenting “Tucker Gaussian Process for Regression and Collaborative Filtering”, Qinyi Zhang will be presenting “Large-Scale Kernel Methods for Independence Testing”, and Jovana Mitrovic will be presenting “Disentangling the Factors of Variation at Initialization in Neural Networks” at the Women in Machine Learning Workshop.

Postdoctoral Research Assistant in Machine Learning
We will have an opening for a two-year full-time Postdoctoral Research Assistant, in the areas of kernel methods, Gaussian processes, or probabilistic programming. Queries should be addressed to Professor Yee Whye Teh (y.w.teh@stats.ox.ac.uk).

Posterior Server Software Released
We have released the source code implementing the Posterior Server. It can be found on GitHub at https://github.com/BigBayes/PosteriorServer. The code is written in Julia and is released under the MIT license.

International Prize in Statistics awarded to David Cox
Congratulations to Professor Sir David Cox on being awarded the first ever International Prize in Statistics in recognition of his many extraordinary contributions to statistics and science, especially his introduction of the proportional hazards model in a groundbreaking 1972 paper.
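For readers less familiar with survival analysis, the model the prize citation refers to can be stated briefly; this is the standard textbook form of the 1972 model, not text from the prize announcement:

```latex
% Cox proportional hazards model (Cox, 1972).
% \lambda_0(t) is an unspecified baseline hazard common to all subjects;
% x is a subject's covariate vector and \beta the regression coefficients.
\lambda(t \mid x) = \lambda_0(t)\, \exp\!\left(\beta^{\top} x\right)
```

Because covariates act multiplicatively on a shared baseline hazard, the hazard ratio between any two subjects is constant over time, which is what “proportional” refers to, and the coefficients $\beta$ can be estimated without specifying $\lambda_0(t)$.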

Yee Whye Teh to Co-Chair ICML 2017
Yee Whye will be programme co-chair for ICML 2017 along with Doina Precup, while Tony Jebara is general chair.

DeepMind Scholarship
We sincerely thank DeepMind for funding a DPhil Scholarship used to support Chris Maddison!

Machine Learning Group Retreat
We went on a group summer retreat to Trogir, Croatia, 23-28 August. It was a great opportunity to update each other on current research projects, and a chance to discuss future projects and initiate collaborations in an informal, relaxed setting. Of course, it was also a lot of fun and a great bonding time for the group!

Postdoctoral Research Assistant in Machine Learning
Applications are invited for two full-time ERC-funded Postdoctoral Research Assistants to work on the project ‘BigBayes: Rich, Structured and Efficient Learning of Big Bayesian Models’ in the Department of Statistics.