26–30 Jun 2023
ECT*
Europe/Rome timezone

Global and local symmetries in neural networks

27 Jun 2023, 11:00
25m
Aula Renzo Leonardi (ECT*)

Strada delle Tabarelle 286, I-38123 Villazzano (Trento)

Speaker

Daniel Schuh (TU Wien)

Description

Incorporating symmetries into neural network architectures has become increasingly popular. Convolutional Neural Networks (CNNs) leverage the assumption of global translational symmetry in the data to ensure that their predicted observables transform properly under translations. Lattice gauge equivariant Convolutional Neural Networks (L-CNNs) [1] are designed to respect local gauge symmetry, which is an essential component of lattice gauge theories. This property makes them effective in approximating gauge covariant functions on a lattice. Since many observables exhibit global symmetries in addition to translations, an extension of the L-CNN to a more general symmetry group, including for example rotations and reflections [2], is desirable.
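The translational equivariance that ordinary CNNs rely on can be checked numerically. The following is a minimal sketch (not code from the talk): a one-dimensional periodic convolution commutes with lattice translations, i.e. convolving a shifted field gives the shifted output.

```python
import numpy as np

def periodic_conv(f, k):
    """1D convolution of a lattice field f with a local kernel k,
    using periodic boundary conditions."""
    n = len(f)
    return np.array([sum(k[j] * f[(i + j) % n] for j in range(len(k)))
                     for i in range(n)])

rng = np.random.default_rng(0)
f = rng.normal(size=8)   # scalar field on an 8-site periodic lattice
k = rng.normal(size=3)   # local 3-site kernel

shift = 2
# Equivariance: convolving the translated field equals translating the output.
lhs = periodic_conv(np.roll(f, shift), k)
rhs = np.roll(periodic_conv(f, k), shift)
assert np.allclose(lhs, rhs)
```

The same commutation property is what the L-CNN layers generalize: instead of plain translations acting on scalar fields, the layers must commute with local gauge transformations acting on link variables.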

In this talk, I will present some of the essential L-CNN layers and motivate why they can approximate gauge equivariant functions on a lattice. I will comment on the robustness of such a network against adversarial attacks along gauge orbits in comparison to a traditional CNN. Then, I will provide a geometric formulation of L-CNNs and show how convolutions in L-CNNs arise as a special case of gauge equivariant neural networks on $\mathrm{SU}(N)$ principal bundles. Finally, I will discuss how the L-CNN layers can be generalized to respect global rotations and reflections in addition to translations.
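The local gauge symmetry mentioned above can also be made concrete with a toy example (again, a sketch for illustration, not the L-CNN implementation): on a periodic 1D lattice with SU(2) link variables, a gauge transformation acts as $U_x \to \Omega_x U_x \Omega_{x+1}^\dagger$, and the traced ordered product of links around the lattice (a Wilson/Polyakov loop) is invariant. Gauge covariant layers are built so that exactly such traced loops can be formed from their outputs.

```python
import numpy as np

def random_su2(rng):
    """Random SU(2) matrix parametrized by a unit quaternion."""
    a = rng.normal(size=4)
    a /= np.linalg.norm(a)
    return np.array([[ a[0] + 1j * a[1],  a[2] + 1j * a[3]],
                     [-a[2] + 1j * a[3],  a[0] - 1j * a[1]]])

rng = np.random.default_rng(1)
n = 6
links  = [random_su2(rng) for _ in range(n)]   # link variables U_x
omegas = [random_su2(rng) for _ in range(n)]   # local gauge transformation Omega_x

def polyakov_trace(U):
    """Trace of the ordered product of links around the periodic lattice."""
    W = np.eye(2, dtype=complex)
    for u in U:
        W = W @ u
    return np.trace(W)

# Gauge transform: U_x -> Omega_x U_x Omega_{x+1}^dagger (periodic in x).
transformed = [omegas[x] @ links[x] @ omegas[(x + 1) % n].conj().T
               for x in range(n)]

# The traced loop is gauge invariant: the Omega factors cancel pairwise,
# leaving a similarity transformation that the trace does not see.
assert np.isclose(polyakov_trace(links), polyakov_trace(transformed))
```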

[1] M. Favoni, A. Ipp, D. I. Müller, D. Schuh, Phys. Rev. Lett. 128 (2022), 032003, [arXiv:2012.12901]
[2] J. Aronsson, D. I. Müller, D. Schuh [arXiv:2303.11448]

Primary author

Daniel Schuh (TU Wien)

Co-authors

Andreas Ipp (TU Wien), David Müller (TU Wien), Jimmy Aronsson (Chalmers University of Technology), Matteo Favoni (TU Wien)

Presentation materials