Chalmers Division of Algebra and Geometry seeks a PhD student in Theoretical Deep Learning: Symmetries in Neural Networks.
Project description
In many deep learning applications, ranging from medical image analysis to quantum chemistry, the symmetries of the underlying problem pose an important constraint. These can be taken into account either by modifying the architecture of the network (yielding so-called equivariant neural networks) or by learning the symmetry from the data (data augmentation). Over the past year, the debate around the trade-offs between data augmentation and manifest equivariance has intensified, with data augmentation gaining popularity in notable domains such as protein design and quantum chemistry. The aim of this project is to contribute to this debate with a theoretical framework for data augmentation in deep neural networks, together with novel algorithms.
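To make the contrast concrete, here is a minimal sketch (illustrative only; the function names and the choice of the rotation group C4 are assumptions, not part of the project description) of the two strategies for a square image: enlarging the training set with all group transformations (data augmentation), and averaging a model's predictions over the group orbit, which produces an exactly invariant predictor of the kind an equivariant architecture builds in by construction.

```python
import numpy as np

def c4_orbit(image):
    """All four 90-degree rotations of a square image (the C4 orbit)."""
    return [np.rot90(image, k) for k in range(4)]

def augment_dataset(images, labels):
    """Data augmentation: add every C4 rotation of each image,
    keeping the corresponding label unchanged."""
    aug_x, aug_y = [], []
    for img, lab in zip(images, labels):
        for rot in c4_orbit(img):
            aug_x.append(rot)
            aug_y.append(lab)
    return np.stack(aug_x), np.array(aug_y)

def symmetrize(model, image):
    """Average the model's output over the orbit.
    The result is exactly C4-invariant, since rotating the input
    only permutes the terms of the average."""
    return np.mean([model(rot) for rot in c4_orbit(image)], axis=0)
```

Even for a model that is not itself invariant (say, one that reads a single corner pixel), the symmetrized predictor returns the same value on an image and on any of its rotations.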
This project will be conducted in close collaboration with Axel Flinth from Umeå University (who will act as WASP co-supervisor) and Pan Kessel from the company Genentech Roche in Basel (who will act as industrial supervisor).
Research environment
In the GAPinDNNs research group on Geometry, Algebra and Physics in Deep Neural Networks at the Division of Algebra and Geometry, we conduct research on deep learning, in particular geometric deep learning and mathematics for AI.
The expansion of Artificial Intelligence (AI), in the broad sense, is one of the most exciting developments of the 21st century. This progress opens up many possibilities but also poses challenges.
The Wallenberg AI, Autonomous Systems and Software Program (WASP) is Sweden’s largest individual research program ever and a major national initiative.
We are taking part in this program through the following research project: How Do Neural Networks Learn Symmetries? Augmentation, Randomization and Equivariance.