Normal approximation of random Gaussian neural networks

Nicola Apollonio, Daniela De Canditiis, Giovanni Franzina, Paola Stolfi, Giovanni Luca Torrisi
2024

Abstract

In this talk we provide explicit upper bounds on some distances between the law of the output of a random Gaussian neural network and the law of a random Gaussian vector. Our main results concern deep random Gaussian neural networks with a rather general activation function. The upper bounds show how the widths of the layers, the activation function and other architecture parameters affect the Gaussian approximation of the output. Our techniques, relying on Stein's method and integration-by-parts formulas for the Gaussian law, yield estimates on distances which are integral probability metrics, including the convex distance. The latter metric is defined by testing against indicator functions of measurable convex sets, and therefore allows for accurate estimates of the probability that the output is localized in some region of the space. Such estimates are of significant interest from both a practitioner's and a theorist's perspective.
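The Gaussian approximation described above can be illustrated numerically. The sketch below is not the paper's construction: the `1/sqrt(fan_in)` weight scaling, the `tanh` activation, the unit-variance biases and the layer widths are all illustrative assumptions. It draws many independent networks with i.i.d. Gaussian weights, collects the scalar outputs, and measures the Kolmogorov distance to a fitted normal law — the one-dimensional special case of the convex distance, since half-lines are convex sets.

```python
import numpy as np
from math import erf, sqrt

def random_gaussian_network(x, widths, rng, activation=np.tanh):
    """One draw of a deep network with i.i.d. Gaussian weights and biases.

    Weights have standard deviation 1/sqrt(fan_in) so pre-activations
    stay O(1) as widths grow (an assumption for this sketch).
    """
    h = x
    for width in widths:
        W = rng.normal(0.0, 1.0 / np.sqrt(h.shape[0]), size=(width, h.shape[0]))
        b = rng.normal(0.0, 1.0, size=width)
        h = activation(W @ h + b)
    # linear readout to a scalar output
    w_out = rng.normal(0.0, 1.0 / np.sqrt(h.shape[0]), size=h.shape[0])
    return float(w_out @ h)

rng = np.random.default_rng(0)
x = np.ones(10)  # a fixed input vector
samples = np.array([random_gaussian_network(x, [200, 200], rng)
                    for _ in range(2000)])

# Kolmogorov distance between the standardized empirical law and N(0, 1):
# sup over half-lines of |P(output <= t) - Phi(t)|.
z = np.sort((samples - samples.mean()) / samples.std())
normal_cdf = 0.5 * (1.0 + np.array([erf(v / sqrt(2.0)) for v in z]))
empirical_cdf = np.arange(1, z.size + 1) / z.size
ks = np.max(np.abs(empirical_cdf - normal_cdf))
print(f"Kolmogorov distance: {ks:.4f}")
```

With wide layers (here 200 units) the measured distance is small, consistent with the qualitative message of the upper bounds: as widths grow, the output law approaches that of a Gaussian vector.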
Istituto per le applicazioni del calcolo - IAC - Sede Secondaria Napoli
Keywords: Neural Network
Files in this product:
  • presentation_DeCanditiisDaniela.pdf (not available)
    Type: Other attached material
    License: Other license type
    Size: 500.18 kB
    Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/510387
Citations
  • PMC: ND
  • Scopus: ND
  • Web of Science (ISI): ND