Tuesday, November 21, 2023, at 12:00 hrs. CMM Seminar Room, 7th floor, Beauchef 851

Speaker: Mircea Petrache (PUC)

Title: Approximate and partial symmetries in Deep Learning

Abstract: The image of a rotated cat still represents a cat: while this simple rule seems obvious to a human, it is not obvious to neural networks, which separately "learn" each new rotation of the same image. The same applies to different groups of symmetries for images, graphs, texts, and other types of data. Implementing "equivariant" neural networks that respect these symmetries reduces the number of learned parameters and helps improve generalization outside the training set. On the other hand, in networks that "identify too much", that is, where we impose too many symmetries, the error begins to increase because the data are no longer respected. In joint work with S. Trivedi (NeurIPS 2023), we quantify this tradeoff, which allows us to define the optimal amount of symmetry in learning models. I will give an introduction to classical learning-theory bounds and to our extension of these ideas to the study of "partial/approximate equivariance". In passing, I will describe some possible directions for working with partial symmetries in specific tasks.
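As a minimal illustration of the symmetry idea in the abstract (this sketch is not from the talk; the model `score` and the group-averaging construction are illustrative assumptions), one can force any image-to-score function to respect the C4 group of 90-degree rotations by averaging its outputs over the group:

```python
# Illustrative sketch: enforcing invariance under 90-degree rotations
# by averaging a function's outputs over the C4 group ("group averaging").
import numpy as np

def score(img):
    # A hypothetical, non-invariant model: a fixed linear readout.
    rng = np.random.default_rng(0)
    w = rng.standard_normal(img.shape)
    return float((w * img).sum())

def c4_average(f, img):
    # Average f over all four 90-degree rotations of the input.
    # The result is invariant: rotating img permutes the four terms.
    return sum(f(np.rot90(img, k)) for k in range(4)) / 4.0

img = np.arange(16.0).reshape(4, 4)
a = c4_average(score, img)
b = c4_average(score, np.rot90(img))
assert abs(a - b) < 1e-9  # same score for the image and its rotation
```

The averaged model needs no rotation-specific parameters, echoing the parameter-reduction point in the abstract; imposing a larger group than the data actually respects is exactly the over-identification the talk quantifies.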

Link to access via Zoom: