
Large N expansion for wide neural networks

January 27, 2020 - 2:00pm to 4:00pm
Varian Physics - Room 355


Google Research Scientist Guy Gur-Ari will give the Stanford Institute for Theoretical Physics (SITP) Monday Colloquium.

Abstract: Understanding the asymptotic behavior of wide neural networks is of considerable interest in machine learning. In the large-width limit, we keep the number of layers of a network fixed and send the number of neurons per layer to infinity. We present a general method for deriving scaling laws of correlators in this limit. The method is an adaptation of 't Hooft's large N expansion, where the number of neurons plays the role of N. We apply our method to study training dynamics during gradient descent. We improve on existing results in the strict large-width limit, and compute the 1/N correction to network evolution. The talk is based on
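As a rough numerical illustration of the kind of scaling law the abstract describes (this toy model is our own, not taken from the talk): for a one-hidden-layer network f(x) = (1/sqrt(N)) Σ_i v_i tanh(w_i x) with random Gaussian weights, the variance of f stays O(1) as the width N grows, while the connected four-point correlator (excess kurtosis) decays like 1/N, so the output distribution becomes Gaussian in the strict large-width limit.

```python
import numpy as np

def random_net_outputs(width, n_samples, x=1.0, rng=None):
    """Outputs of a toy one-hidden-layer net
    f(x) = (1/sqrt(N)) * sum_i v_i * tanh(w_i * x)
    over n_samples independent random initializations.
    The 1/sqrt(N) factor is the standard large-width normalization."""
    rng = np.random.default_rng(0) if rng is None else rng
    w = rng.standard_normal((n_samples, width))  # input weights
    v = rng.standard_normal((n_samples, width))  # output weights
    return (v * np.tanh(w * x)).sum(axis=1) / np.sqrt(width)

def excess_kurtosis(f):
    """Connected four-point function of f, normalized by variance^2.
    Vanishes for a Gaussian; expected to scale like 1/N here."""
    f = f - f.mean()
    return (f ** 4).mean() / (f ** 2).mean() ** 2 - 3.0

rng = np.random.default_rng(0)
k_narrow = excess_kurtosis(random_net_outputs(8, 200_000, rng=rng))
k_wide = excess_kurtosis(random_net_outputs(64, 200_000, rng=rng))
# Widening the layer by 8x should shrink the connected correlator by ~8x.
print(f"N=8:  excess kurtosis ~ {k_narrow:.3f}")
print(f"N=64: excess kurtosis ~ {k_wide:.3f}")
```

Since fourth cumulants of independent terms add while the variance is normalized to O(1), the excess kurtosis of the sum is the per-neuron value divided by N, which is the simplest instance of the 1/N suppression of connected correlators.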

Event Sponsor: the Stanford Institute for Theoretical Physics