PhD position in mathematics, focusing on geometric deep learning, at the Department of Mathematics and Mathematical Statistics at Umeå University.
Project description and tasks
Deep learning models, particularly deep convolutional neural networks (CNNs), have enjoyed tremendous success on an impressive number of complex problems. However, a fundamental mathematical understanding of these models and of their extensions to non-flat data is still lacking, presenting an exciting research problem that spans several areas, such as differential geometry, numerical analysis, and dynamical systems.
A promising approach, referred to as geometric deep learning, is to account for the geometry of the input data by making the networks equivariant with respect to the symmetries of the data. This means that a transformation of the input produces a corresponding transformation of the output. Making such geometric structures manifest amounts to incorporating prior knowledge of the system to facilitate learning.
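As a schematic illustration (the notation here is chosen for concreteness and is not taken from the announcement): a layer \Phi is equivariant under a symmetry group G acting on its input and output spaces through representations \rho_{\mathrm{in}} and \rho_{\mathrm{out}} if

    \Phi\bigl(\rho_{\mathrm{in}}(g)\,x\bigr) \;=\; \rho_{\mathrm{out}}(g)\,\Phi(x) \qquad \text{for all } g \in G,

of which ordinary CNNs are the prototypical example: they are equivariant under translations of the image plane.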
Another development explores the connection to differential equations and dynamical systems in the limit of infinitely deep networks. When the learning problem is formulated in terms of the dynamics that propagate information through the network, it becomes amenable to powerful numerical techniques for differential equations and to the rich theory of dynamical systems.
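One standard way to make this limit concrete, sketched here only as an illustration: a residual layer x_{k+1} = x_k + h\, f_\theta(x_k) can be read as an explicit Euler step of size h for the ordinary differential equation

    \dot{x}(t) \;=\; f_\theta\bigl(x(t), t\bigr),

so that a very deep residual network approximates the flow of this equation, and training becomes a control problem for the parametrized vector field f_\theta.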
The project aims to unify the two approaches and develop the mathematical foundations of the emerging field of neural differential equations by exploring the connection to geometric deep learning. The objective is to develop a manifestly geometric formulation of equivariant neural differential equations, unifying symmetries of the data manifold and symmetries of the differential equations, and to investigate the properties and applications of such a formulation.
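By way of a rough sketch, and in our own notation rather than that of the project: for a neural differential equation \dot{x} = f_\theta(x, t) evolving on a data manifold M with symmetry group G, equivariance amounts to requiring the vector field itself to be compatible with the group action,

    f_\theta(g \cdot x, t) \;=\; (\mathrm{d}g)_x\, f_\theta(x, t) \qquad \text{for all } g \in G,

so that the induced flow commutes with the action of G on M; making this structure explicit is one way the symmetries of the data and the symmetries of the dynamics could be treated on the same footing.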