Functional Fluctuations of Fully Connected Neural Networks
We present novel probabilistic bounds, proved in Favaro, Hanin, Marinucci, Nourdin and Peccati (2023), that quantify the discrepancy between the distribution of a fully connected neural network with Gaussian weights at initialization and that of its infinite-width Gaussian limit. Our techniques rely on an explicit representation of the 2-Wasserstein distance between Gaussian random elements on Hilbert spaces, combined with embedding arguments.
Area: CS24 - Neural Networks at initialization (Michele Salvi)
Keywords: Neural Networks; Functional Limit Theorems; Probabilistic Approximations
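The abstract does not reproduce the representation it uses, but as background, the 2-Wasserstein distance between two Gaussian measures admits a classical closed form in finite dimension: W2²(N(m₁,C₁), N(m₂,C₂)) = |m₁−m₂|² + tr(C₁ + C₂ − 2(C₁^{1/2} C₂ C₁^{1/2})^{1/2}). The following is a minimal NumPy sketch of that formula (not the paper's Hilbert-space construction); the function names are illustrative.

```python
import numpy as np

def psd_sqrt(A):
    """Square root of a symmetric PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(np.asarray(A, dtype=float))
    # clip tiny negative eigenvalues caused by round-off
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

def w2_gaussian(m1, C1, m2, C2):
    """2-Wasserstein distance between N(m1, C1) and N(m2, C2).

    Uses the closed form
        W2^2 = |m1 - m2|^2 + tr(C1 + C2 - 2 (C1^{1/2} C2 C1^{1/2})^{1/2}).
    """
    m1, m2 = np.asarray(m1, dtype=float), np.asarray(m2, dtype=float)
    s1 = psd_sqrt(C1)
    cross = psd_sqrt(s1 @ np.asarray(C2, dtype=float) @ s1)
    d2 = (np.sum((m1 - m2) ** 2)
          + np.trace(C1) + np.trace(C2) - 2.0 * np.trace(cross))
    return float(np.sqrt(max(d2, 0.0)))

# Sanity check in one dimension, where W2^2 = (m1 - m2)^2 + (s1 - s2)^2:
# N(0, 1) vs N(3, 4) gives W2^2 = 9 + (1 - 2)^2 = 10.
print(w2_gaussian([0.0], [[1.0]], [3.0], [[4.0]]))  # → sqrt(10) ≈ 3.1623
```

In the infinite-width analysis, the same kind of formula is applied to Gaussian random elements on a Hilbert space; the finite-dimensional version above is what one would compute after truncating to finitely many coordinates.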