Talk page

Title:
Margins, perceptrons, and deep networks

Speaker:
Matus Telgarsky

Abstract:
This talk surveys the role of margins in the analysis of deep networks. As a concrete highlight, it sketches a perceptron-based analysis establishing that shallow ReLU networks can achieve small test error even when they are quite narrow, with width sometimes only logarithmic in the sample size and the inverse target error. The analysis depends on a certain nonlinear margin quantity due to Nitanda and Suzuki, which moreover can lead to tight upper and lower sample complexity bounds. Joint work with Ziwei Ji.
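
For reference, here is a sketch of the margin quantity mentioned above, as it appears (to my reading) in the Nitanda–Suzuki line of work adapted to ReLU networks; the exact statement in the talk may differ. The data distribution is assumed to admit a margin \gamma > 0 witnessed by a map \bar v \colon \mathbb{R}^d \to \mathbb{R}^d with \|\bar v(w)\| \le 1 for all w, meaning every example (x, y) satisfies

\[
  y \,\mathbb{E}_{w \sim \mathcal{N}(0, I_d)}\!\left[ \bar v(w)^{\top} x \,\mathbf{1}\!\left[ w^{\top} x > 0 \right] \right] \;\ge\; \gamma .
\]

This is a linear margin with respect to the infinite-width random-feature (NTK) embedding x \mapsto (w \mapsto x \,\mathbf{1}[w^{\top} x > 0]) of a shallow ReLU network, which is what makes a perceptron-style argument possible; the bounds in the abstract depend on this \gamma.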

Link:
https://www.ias.edu/video/machinelearning/2020/0326-MatusTelgarsky