Dominic Richards

Statistical Learning Theory, Optimisation, Implicit Regularisation

I am a DPhil student at the University of Oxford, supervised by Yee Whye Teh and Patrick Rebeschini. My research focuses on the statistical properties of models that are implicitly regularised by gradient descent. More recently, I have been studying the statistical performance of the minimum-norm solution in high-dimensional least squares regression.
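As a minimal illustration of the object studied in that last line (a toy numpy sketch of the general idea, not code from any of the papers below): when there are more features than samples, the least squares problem has infinitely many interpolating solutions, and gradient descent started from zero converges to the one of minimum Euclidean norm, which the pseudoinverse gives in closed form.

    import numpy as np

    # Overparameterised regime: d features > n samples, so X w = y has
    # infinitely many solutions. Gradient descent initialised at zero stays
    # in the row space of X and so converges to the minimum-norm
    # interpolant; the Moore-Penrose pseudoinverse computes it directly.
    rng = np.random.default_rng(0)
    n, d = 20, 100
    X = rng.standard_normal((n, d))
    y = rng.standard_normal(n)

    w_min_norm = np.linalg.pinv(X) @ y  # minimum l2-norm solution of X w = y

    print(np.allclose(X @ w_min_norm, y))  # True: it interpolates the data
    print(np.linalg.norm(w_min_norm))      # smallest norm among all solutions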

Publications

2021

  • D. Richards, J. Mourtada, L. Rosasco, Asymptotics of Ridge(less) Regression under General Source Condition, in Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, 2021, vol. 130, 3889–3897.
  • D. Richards, M. Rabbat, Learning with Gradient Descent and Weakly Convex Losses, in Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, 2021, vol. 130, 1990–1998.

2020

  • D. Richards, P. Rebeschini, Graph-Dependent Implicit Regularisation for Distributed Stochastic Subgradient Descent, Journal of Machine Learning Research, vol. 21, no. 34, 1–44, 2020.
  • D. Richards, P. Rebeschini, L. Rosasco, Decentralised Learning with Random Features and Distributed Gradient Descent, in Proceedings of the 37th International Conference on Machine Learning, 2020, vol. 119, 8105–8115.

2019

  • D. Richards, P. Rebeschini, Optimal Statistical Rates for Decentralised Non-Parametric Regression with Linear Speed-Up, in Advances in Neural Information Processing Systems 32, H. Wallach, H. Larochelle, A. Beygelzimer, F. d’Alché-Buc, E. Fox, and R. Garnett, Eds. Curran Associates, Inc., 2019, 1216–1227.