Short Courses
Justin Salez, Université Paris-Dauphine & PSL - The cutoff phenomenon for Markov chains
The cutoff phenomenon is an abrupt phase transition in the convergence to equilibrium of certain Markov chains, in the limit where the number of states tends to infinity. Discovered in the 1980s by Aldous, Diaconis and Shahshahani in the context of card shuffling, it has since been independently observed in a variety of contexts, including random walks on graphs and groups, high-temperature spin glasses, and interacting particle systems. Nevertheless, a general theory is still missing, and identifying the mechanisms underlying this mysterious phenomenon remains one of the most fundamental problems in the area of mixing times. The goal of this mini-course is to provide a self-contained introduction to this fascinating question, illustrated with many examples and a selection of open problems. I will also present a new approach based on entropy and curvature, which has recently led to a systematic proof of cutoff for a broad class of chains.
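As a small numerical illustration of cutoff (not part of the course material), consider the classical lazy random walk on the hypercube {0,1}^n: at each step, pick a coordinate uniformly at random and replace it by a fresh uniform bit. Starting from the all-zeros state, the distribution is symmetric within each Hamming weight, so the total-variation distance to the uniform distribution can be computed exactly from the induced birth-and-death chain on the weight. This chain is known to exhibit cutoff around (1/2) n log n. A minimal Python sketch:

```python
import math

def tv_curve(n, t_max):
    """Total-variation distance to uniform on {0,1}^n after t = 0..t_max
    steps of the lazy walk (pick a uniform coordinate, re-randomize it),
    started from the all-zeros state. By symmetry it suffices to track
    the distribution of the Hamming weight."""
    p = [0.0] * (n + 1)
    p[0] = 1.0  # start at weight 0
    # stationary weight distribution: Binomial(n, 1/2)
    pi = [math.comb(n, k) / 2**n for k in range(n + 1)]
    tvs = []
    for _ in range(t_max + 1):
        tvs.append(0.5 * sum(abs(a - b) for a, b in zip(p, pi)))
        q = [0.0] * (n + 1)
        for k in range(n + 1):
            down = k / (2 * n)        # a 1-coordinate is picked and set to 0
            up = (n - k) / (2 * n)    # a 0-coordinate is picked and set to 1
            if k > 0:
                q[k - 1] += p[k] * down
            if k < n:
                q[k + 1] += p[k] * up
            q[k] += p[k] * (1 - down - up)
        p = q
    return tvs

# For n = 100, the distance stays near 1 well before (1/2) n log n ~ 230
# steps and drops close to 0 shortly after: an abrupt transition rather
# than a gradual decay.
tvs = tv_curve(100, 460)
```

Plotting `tvs` against the step count for growing n makes the drop visibly sharper, which is exactly the cutoff picture described above.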
Sophie Langer, University of Twente - On the statistical theory of deep learning
Although deep learning has reached state-of-the-art performance in many machine learning tasks, our theoretical understanding of the methodology is still in its infancy. Most applications rely on intuition and trial and error. So far, we have only a limited understanding of
- why we can reliably optimize non-convex objectives,
- how expressive our architectures are with respect to the class of hypotheses they describe, and
- why most complex models generalize to unseen examples, even when trained on data sets orders of magnitude smaller than what classical statistical learning theory considers sufficient.