Speaker
Dr. Anindita Maiti (Perimeter Institute for Theoretical Physics)
Description
The key to the performance of ML algorithms is the ability to separate relevant features of input datasets from irrelevant ones. In a setup where data features play the role of an energy scale, we develop a Wilsonian RG framework that integrates out the unlearnable modes of the Neural Network Gaussian Process (NNGP) kernel in the regression context. In this scenario, Gaussian feature modes produce a universal flow of the ridge parameter, whereas non-Gaussianities lead to rich, input-dependent RG flows. This framework goes beyond the usual analogies between RG flows and learning dynamics, and offers potential improvements to our understanding of feature learning and of the universality classes of models.
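To make the central idea concrete, the following is a minimal, illustrative sketch rather than the authors' scheme: in NNGP kernel ridge regression, discarding ("integrating out") the low-eigenvalue kernel modes below a cutoff can be partially absorbed into a shifted effective ridge parameter. The ReLU arc-cosine kernel, the quantile cutoff, and the ridge-shift ansatz (adding the mean of the discarded eigenvalues) are assumptions chosen for illustration only, not the flow equations derived in the work.

import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d)) / np.sqrt(d)
y = np.sin(X @ rng.normal(size=d)) + 0.1 * rng.normal(size=n)

def nngp_relu_kernel(X1, X2):
    # Arc-cosine kernel of a wide single-hidden-layer ReLU network (Cho & Saul),
    # used here as a stand-in NNGP kernel.
    norms1 = np.linalg.norm(X1, axis=1, keepdims=True)
    norms2 = np.linalg.norm(X2, axis=1, keepdims=True)
    cos = np.clip(X1 @ X2.T / (norms1 * norms2.T), -1.0, 1.0)
    theta = np.arccos(cos)
    return (norms1 * norms2.T) / (2 * np.pi) * (np.sin(theta) + (np.pi - theta) * cos)

K = nngp_relu_kernel(X, X)
ridge = 1e-2

# Full NNGP / kernel-ridge predictor evaluated on the training inputs.
alpha_full = np.linalg.solve(K + ridge * np.eye(n), y)
pred_full = K @ alpha_full

# "Integrate out" kernel modes whose eigenvalues fall below an assumed cutoff.
evals, evecs = np.linalg.eigh(K)
cutoff = np.quantile(evals, 0.25)      # hypothetical RG cutoff scale
keep = evals > cutoff
K_trunc = (evecs[:, keep] * evals[keep]) @ evecs[:, keep].T

# Hypothetical ansatz: absorb the discarded modes into a shifted ridge term.
ridge_eff = ridge + evals[~keep].mean()
alpha_trunc = np.linalg.solve(K_trunc + ridge_eff * np.eye(n), y)
pred_trunc = K_trunc @ alpha_trunc

print("relative change in train predictions after integrating out modes:",
      np.linalg.norm(pred_trunc - pred_full) / np.linalg.norm(pred_full))

Comparing pred_full with pred_trunc gives a rough sense of how much of the discarded modes' effect is captured by a renormalized ridge term; per the abstract, non-Gaussian features instead induce input-dependent flows beyond such a simple shift.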
Primary author
Dr. Anindita Maiti (Perimeter Institute for Theoretical Physics)
Co-authors
Dr. Jessica N. Howard (Kavli Institute for Theoretical Physics, Santa Barbara, CA, USA)
Prof. Ro Jefferson (Institute for Theoretical Physics and Department of Information and Computing Sciences, Utrecht University, Princetonplein 5, 3584 CC Utrecht, The Netherlands)
Prof. Zohar Ringel (The Racah Institute of Physics, The Hebrew University of Jerusalem)