Talk page

Title:
Shifted divergences for sampling, privacy, and beyond

Speaker:
Jason Altschuler

Abstract:
Shifted divergences provide a principled way of making information-theoretic divergences (e.g., KL) geometrically aware via optimal transport smoothing. In this talk, I will argue that shifted divergences provide a powerful approach to unifying optimization, sampling, privacy, and beyond. For concreteness, I will demonstrate these connections via three recent highlights: (1) characterizing the differential privacy of Noisy-SGD, the standard algorithm for private convex optimization; (2) characterizing the mixing time of the Langevin Algorithm to its stationary distribution for log-concave sampling; (3) the fastest high-accuracy algorithm for sampling from log-concave distributions. A recurring theme is a certain notion of algorithmic stability, and the central technique for establishing it is shifted divergences. Based on joint work with Kunal Talwar, and with Sinho Chewi.
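
For reference, the "optimal transport smoothing" in the abstract typically means taking an infimum of the divergence over one argument within a transport ball. The LaTeX sketch below illustrates this; the specific choices of the Renyi divergence D_alpha and the infinity-Wasserstein metric W_infinity are assumptions based on the related privacy-amplification literature and are not fixed by the abstract itself.

% Hedged sketch of a shifted divergence (assumed Renyi / W_infinity variant):
\[
  D_\alpha^{(z)}(\mu \,\|\, \nu)
    \;:=\; \inf_{\mu' \,:\, W_\infty(\mu,\, \mu') \le z} D_\alpha(\mu' \,\|\, \nu),
\]
% where z >= 0 is the shift: z = 0 recovers the usual divergence, while a
% positive z discounts mass that can be transported distance at most z,
% making the divergence aware of the underlying geometry.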

Link:
https://mathtube.org/lecture/video/shifted-divergences-sampling-privacy-and-beyond

Workshop:
MathTube - Kantorovich Initiative Seminar