Effective Theory of Deep Neural Networks

Speaker: Sho Yaida
Date: Monday, October 9, 2023, 2:00pm
Affiliation: Meta
Event Sponsor: Stanford Institute for Theoretical Physics
Location: Varian 355

Large neural networks perform extremely well in practice, providing the backbone of modern machine learning. The goal of this talk is to lay out a blueprint for analyzing these large models theoretically, from first principles. In particular, we'll survey how the statistics and dynamics of deep neural networks drastically simplify at large width and become analytically tractable. In so doing, we'll see that the idealized infinite-width limit is too simple to capture several important aspects of deep learning, such as representation learning. To address these shortcomings, we'll step beyond the idealized limit and systematically incorporate finite-width corrections.
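
The abstract's central claim, that random networks become Gaussian (and hence analytically simple) at large width, with deviations controlled by finite-width corrections, can be illustrated with a toy numerical experiment. The minimal sketch below is not from the talk; it makes its own illustrative assumptions (a single tanh hidden layer, one fixed input, Gaussian weights with 1/width variance in the second layer). It samples the scalar output of random one-hidden-layer networks and measures its excess kurtosis, which vanishes for an exact Gaussian and here decays roughly like 1/width (up to sampling noise), a simple stand-in for the finite-width effects the talk organizes systematically.

import numpy as np

rng = np.random.default_rng(0)

def output_preactivation(width, n_samples, x=1.0):
    # z = sum_j W2_j * tanh(W1_j * x), sampled over random initializations.
    # Second-layer variance 1/width keeps z of order one as width grows.
    w1 = rng.normal(0.0, 1.0, size=(n_samples, width))
    w2 = rng.normal(0.0, 1.0 / np.sqrt(width), size=(n_samples, width))
    return np.sum(w2 * np.tanh(w1 * x), axis=1)

def excess_kurtosis(z):
    # Connected fourth moment over variance squared; exactly zero for a Gaussian.
    z = z - z.mean()
    return np.mean(z**4) / np.mean(z**2) ** 2 - 3.0

for width in (2, 8, 32, 128):
    z = output_preactivation(width, n_samples=200_000)
    print(f"width={width:4d}  excess kurtosis ~ {excess_kurtosis(z):+.4f}")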