Michael Hintermüller, Kostas Papafitsoros, Carsten Gräser
01.04.2022 − 31.03.2025
Multilevel methods for training nonsmooth artificial neural networks will be developed, analyzed, and implemented. Tailored refinement and coarsening strategies for the optimization parameters, in terms of the number of neurons, the number of layers, and the network architecture, will be studied. Efficient nonsmooth optimization methods will be introduced and used to treat the level-specific problems. The framework will be applied to problems that have an inherent multilevel structure: learned regularization in image processing, neural network-based PDE solvers, and learning-informed physics. The resulting software will be made publicly available.