Bayesian Deep Learning 2-year postdoc position

Julyan Arbel

Laboratory: Statify team, Inria and Laboratoire Jean Kuntzmann

Inria Grenoble Rhône-Alpes, 655 Avenue de l’Europe, 38330 Montbonnot-Saint-Martin

Phone: 06 43 28 75 03, Email: julyan.arbel@inria.fr

Webpage: https://www.julyanarbel.com/

Judith Rousseau

Laboratory: Department of Statistics, Oxford University, UK

Phone: +44 (0)1865 282 865, Email: judith.rousseau@stats.ox.ac.uk

Webpage: http://www.stats.ox.ac.uk/~rousseau/

Funding
Judith Rousseau’s ERC Advanced Grant 2018, “General Theory for Big Bayes” and Grenoble IDEX.

Application deadline: June 17
Starting date: Anytime before December 2020

Interested applicants should write to us with a letter of interest and a CV, and should arrange for two recommendation letters.

Context
Bayesian deep learning brings together two of the most important machine learning paradigms: Bayesian inference and deep learning. On the one hand, Bayesian learning provides a theoretically sound framework to formalise the estimation of the architecture and the parameters of deep neural network models. On the other hand, deep learning offers new tools in Bayesian modelling, e.g. to learn flexible nonparametric priors or computationally efficient posterior distribution approximations.

State of the art
The field of machine learning has recently been deeply impacted by deep learning. Deep neural networks now underpin the state of the art in computer vision and natural language processing, to name just a few areas. While very effective, these models are computationally costly and require large quantities of data to accurately estimate their many parameters. Bayesian statistics offers a theoretically well-grounded framework to reason about uncertainty, and it is one of the cornerstones of modern machine learning. At the same time, the theory underlying deep neural networks is not yet well understood and its foundations remain to be laid. Although the interaction between these two learning paradigms is relatively under-explored, there is great potential for cross-fertilisation between them.

Objectives
The goal of this project is to contribute new theory and practical techniques at the interface of the deep learning and Bayesian paradigms, by investigating how these paradigms can mutually benefit each other. More specifically, we have two methodological objectives: 1) leveraging successful principles of deep learning, such as hierarchies and convolutions, to build posterior distributions that yield rich data representations while accounting for uncertainty, and 2) making such posterior distributions practical in the large-scale regime through approximations. This dichotomy raises challenges with both theoretical and algorithmic aspects, such as building more interpretable parameter priors, scaling up Bayesian deep learning algorithms, and gaining theoretical insight and principled uncertainty quantification for deep learning.

Results
The main expected theoretical breakthrough is principled uncertainty quantification in the use of deep neural networks. This will improve our mathematical understanding of these models, but it should also help us design better algorithms. Specifically, we want to develop a new methodology that may change the way Bayesian deep neural networks are currently designed, making them more generic, simpler to use, and faster to train.

Keywords
Bayesian machine learning, deep learning, predictive models, computational statistics.

Recent related publications
Arbel, J. (2019). Bayesian Statistical Learning and Applications. HDR thesis, Univ. Grenoble-Alpes.
Hayou, S., Doucet, A., and Rousseau, J. (2019). On the impact of the activation function on deep neural networks training. In International Conference on Machine Learning.
Vladimirova, M., Verbeek, J., Mesejo, P., and Arbel, J. (2019). Understanding Priors in Bayesian Neural Networks at the Unit Level. In International Conference on Machine Learning.
Wilson, A. G. (2020). The case for Bayesian deep learning. arXiv preprint arXiv:2001.10995.

Environment
Inria is the French national research institute for digital science and technology. World-leading research and technological innovation are an integral part of its DNA. Inria's 3,500 researchers and engineers put their passion for digital technology to work in nearly 200 project teams, most of which are joint teams with academic partners, including major research universities and the CNRS. They explore new fields, often in collaboration with different disciplines and industrial partners, with the aim of meeting ambitious challenges.
https://www.inria.fr/en/centre-inria-grenoble-rhone-alpes

The University of Oxford is one of the world's leading academic institutions and one of the oldest, with a unique heritage that dates back to the 11th century. Today its reputation, like its longevity, reflects a deep and abiding commitment to excellence in every area of teaching and research. As a result of that commitment, the University enriches international, national and regional communities in countless ways: through the fruits of its research and the skills of its alumni, through sharing academic and cultural resources, and by publishing outstanding materials in many formats for learning and study.
http://www.stats.ox.ac.uk/