Large N expansion for wide neural networks

Date: Monday, January 27th, 2020, 2:00 - 4:00pm
Event Sponsor: Stanford Institute for Theoretical Physics
Location: Varian Physics, Room 355

Guy Gur-Ari, a Research Scientist at Google, will give the Stanford Institute for Theoretical Physics (SITP) Monday Colloquium.

Abstract

Understanding the asymptotic behavior of wide neural networks is of considerable interest in machine learning. In the large width limit, we keep the number of layers of a network fixed and send the number of neurons in each layer to infinity. We present a general method for deriving scaling laws of correlators in this limit. The method is an adaptation of 't Hooft's large N expansion, where the number of neurons plays the role of N. We apply our method to study training dynamics during gradient descent. We improve on existing results in the strict large width limit, and compute the 1/N correction to network evolution. The talk is based on https://arxiv.org/abs/1909.11304.
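As a rough numerical illustration of the width scaling described in the abstract (a sketch, not taken from the talk or the paper), the following Python snippet estimates a connected four-point correlator of a toy one-hidden-layer ReLU network at random initialization, with an assumed 1/sqrt(N) output scaling. In the spirit of the large N counting, the leading two-point function stays O(1) while this connected correlator is expected to shrink roughly like 1/N as the width N grows.

import numpy as np

def network_output(x, W, v, N):
    """Toy one-hidden-layer ReLU network: f(x) = v . relu(W x) / sqrt(N)."""
    return (v @ np.maximum(W @ x, 0.0)) / np.sqrt(N)

def connected_4pt(x, N, n_samples=20000, seed=0):
    """Monte Carlo estimate of the connected 4-point correlator of f(x)
    over random Gaussian initializations; expected to fall off roughly as 1/N."""
    rng = np.random.default_rng(seed)
    d = x.shape[0]
    fs = np.empty(n_samples)
    for i in range(n_samples):
        W = rng.standard_normal((N, d))   # hidden weights ~ N(0, 1)
        v = rng.standard_normal(N)        # output weights ~ N(0, 1)
        fs[i] = network_output(x, W, v, N)
    m2 = np.mean(fs ** 2)
    m4 = np.mean(fs ** 4)
    return m4 - 3.0 * m2 ** 2  # subtract the Gaussian (Wick) contribution

x = np.ones(4)  # fixed input
for N in (16, 64, 256, 1024):
    print(f"N = {N:5d}   connected 4-pt ~ {connected_4pt(x, N):.4f}")

Here the network architecture, parameterization, and choice of correlator are illustrative assumptions; the paper's method applies to a much broader class of networks and correlation functions.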
