Description
Normalizing Flows are a class of deep generative models recently proposed as a promising alternative to conventional Markov Chain Monte Carlo in lattice field theory simulations. Such architectures provide a new way to avoid the large autocorrelations that characterize Monte Carlo simulations close to the continuum limit. In this talk we explore the novel concept of Stochastic Normalizing Flows (SNFs), in which neural-network layers are combined with out-of-equilibrium stochastic updates: in particular, we show how SNFs share the same theoretical framework as Monte Carlo simulations based on Jarzynski's equality. The latter is a well-known result in non-equilibrium statistical mechanics which has proved highly efficient for the computation of free-energy differences in lattice gauge theories. We discuss the most appealing features of this extended class of generative models using numerical results in the $\phi^4$ scalar field theory in two dimensions.
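As a minimal illustration of the Jarzynski machinery invoked above (not the authors' code, and in one dimension rather than a lattice field theory), the sketch below estimates a free-energy difference via $\Delta F = -\log\langle e^{-W}\rangle$, where the work $W$ is accumulated along out-of-equilibrium trajectories that switch a Gaussian action $S_0(x)=x^2/2$ into $S_1(x)=x^2/(2\sigma^2)$, interleaving instantaneous action switches with unadjusted Langevin updates. All names and parameter values are illustrative assumptions; the exact answer here is $\Delta F = -\log\sigma$.

```python
import numpy as np

def jarzynski_delta_f(sigma=2.0, n_samples=5000, n_steps=100, eps=0.05, seed=0):
    """Estimate Delta F between S0(x)=x^2/2 and S1(x)=x^2/(2 sigma^2)
    using Jarzynski's equality over out-of-equilibrium trajectories.
    (Illustrative sketch; all parameter choices are assumptions.)"""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n_samples)          # equilibrium draws from exp(-S0)
    work = np.zeros(n_samples)              # accumulated work per trajectory
    lambdas = np.linspace(0.0, 1.0, n_steps + 1)

    def action(x, lam):
        # Linear interpolation S_lam = (1 - lam) S0 + lam S1
        return 0.5 * x**2 * ((1.0 - lam) + lam / sigma**2)

    for lam_old, lam_new in zip(lambdas[:-1], lambdas[1:]):
        # Work from the instantaneous switch of the action parameter
        work += action(x, lam_new) - action(x, lam_old)
        # One unadjusted Langevin step under the new action
        grad = x * ((1.0 - lam_new) + lam_new / sigma**2)
        x = x - eps * grad + np.sqrt(2.0 * eps) * rng.normal(size=n_samples)

    # Jarzynski's equality: exp(-Delta F) = < exp(-W) >
    return -np.log(np.mean(np.exp(-work)))

print(jarzynski_delta_f())  # exact value is -log(2) ~ -0.693
```

In an SNF the deterministic neural-network layers would be interleaved with stochastic updates of exactly this kind, and the same exponential average of the accumulated work reweights the generated samples to the target distribution.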