Abstract: A natural approach to understanding overparameterized deep neural networks is to ask whether there is some natural limiting behavior as the number of neurons diverges. We present a rigorous limit result of this kind for fully connected networks with “random-feature-style” first and last layers. Specifically, we show that the network weights are approximated by certain “ideal particles” whose distribution and dependencies are described by a McKean–Vlasov mean-field model. We will present the intuition behind our approach, sketch some of the key technical challenges along the way, and connect our results to some of the recent literature on the topic.
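As background (an illustrative sketch only, not the specific equations of the talk), a McKean–Vlasov model describes particles whose dynamics depend on their own law; a generic form is

\[
  d\theta_t \;=\; b\bigl(\theta_t,\mu_t\bigr)\,dt \;+\; \sigma\, dW_t,
  \qquad \mu_t = \operatorname{Law}(\theta_t),
\]

where, in the wide-network setting, finitely many neuron weights \(\theta^1_t,\dots,\theta^N_t\) interact only through their empirical measure \(\mu^N_t = \frac{1}{N}\sum_{i=1}^N \delta_{\theta^i_t}\), and as \(N \to \infty\) each weight is approximated by an independent copy of such an “ideal particle.”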
Date: Aug 20, 2020 at 15:00 h
Venue: Online.
Speaker: Roberto Imbuzeiro Oliveira
Affiliation: IMPA
Coordinators: Daniel Remenik & Joaquín Fontbona
Posted on Aug 13, 2020 in Seminario Probabilidades CMM, Seminars



