Speaker
Description
Recent advances in deep generative modeling have enabled accelerated approaches to sampling complicated probability distributions. In this work, we develop symmetry-equivariant diffusion models to generate lattice field configurations. We build score networks that are equivariant to a range of group transformations and train them using an augmented score matching scheme. By reweighting generated samples, we produce unbiased estimates for observables in scalar $\phi^4$ theory and ${\rm U}(1)$ gauge theory. We extend our framework to sample ${\rm SU}(N)$ degrees of freedom by adapting the score matching technique and the reverse diffusion process to the group manifolds. Our trained models faithfully reproduce the target densities for several toy ${\rm SU}(2)$ theories, which marks a step towards simulating full ${\rm SU}(N)$ gauge theory on the lattice.
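The equivariance property central to the abstract can be illustrated with a minimal sketch: on a 1D periodic lattice, a score "network" built from a circular convolution automatically commutes with lattice translations, one of the symmetries of scalar $\phi^4$ theory. All names and the toy architecture here are illustrative assumptions, not the speaker's actual model.

```python
import numpy as np

def score(phi, kernel):
    """Toy score function: a circular convolution over the lattice.

    Circular convolutions commute with lattice shifts, so this map is
    equivariant under the translation symmetry of the periodic lattice.
    """
    L = len(phi)
    return np.array([
        sum(kernel[j] * phi[(i + j) % L] for j in range(len(kernel)))
        for i in range(L)
    ])

rng = np.random.default_rng(0)
phi = rng.normal(size=8)     # stand-in lattice field configuration
kernel = rng.normal(size=3)  # stand-in for learned convolution weights

# Equivariance check: shifting the field and then applying the score
# gives the same result as applying the score and then shifting.
shift = 3
lhs = score(np.roll(phi, shift), kernel)
rhs = np.roll(score(phi, kernel), shift)
assert np.allclose(lhs, rhs)
```

In a full model the convolution would be replaced by a deep network constrained (or trained via symmetry-augmented score matching) to satisfy the same commutation property for the relevant group transformations.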