Emerging Field 1 “Extracting Dynamical Laws from Complex Data”
This Emerging Field aims to develop novel methods that combine machine learning and mathematical process simulation to derive effective dynamical laws from data. We anticipate obtaining models that generate understanding and physical insight and that can be simulated efficiently, resulting in unprecedented speedups over often intractable direct simulation methods, such as solving the time-dependent Schrödinger equation for quantum processes.
This program faces several challenges. First, current machine learning methods are often black-box approaches, which are insufficient for scientific simulation and measurement data. Second, current machine learning methods mostly address static, stationary, and complete data, whereas scientific data is often dynamic, nonstationary, incomplete, multimodal, and multiscale.
Consequently, the projects within this Emerging Field either develop a theory for machine learning, in particular deep learning, or approach this set of problems from the application side. Moreover, the projects are typically highly interdisciplinary and draw on a combination of numerous mathematical areas.
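As a minimal illustration of what "extracting a dynamical law from data" can mean (a sketch for intuition only, not part of any EF1 project), one can recover the coefficient matrix of a linear system dx/dt = Ax from a sampled trajectory by least-squares regression on finite-difference estimates of the derivative:

```python
import numpy as np

# Ground-truth dynamical law: a damped harmonic oscillator, dx/dt = A @ x.
A_true = np.array([[0.0, 1.0],
                   [-1.0, -0.1]])

# Simulate a trajectory with a simple explicit Euler scheme.
dt, n_steps = 0.01, 2000
X = np.empty((n_steps, 2))
X[0] = [1.0, 0.0]
for k in range(n_steps - 1):
    X[k + 1] = X[k] + dt * A_true @ X[k]

# Estimate derivatives by finite differences and fit dX/dt ~ X @ A.T
# with ordinary least squares -- the "extracted" dynamical law.
dX = (X[1:] - X[:-1]) / dt
A_est = np.linalg.lstsq(X[:-1], dX, rcond=None)[0].T

print(np.round(A_est, 2))  # close to A_true
```

Real problems are of course nonlinear, noisy, and high-dimensional, which is precisely where the combination of machine learning and mathematical modelling pursued here becomes essential.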
Scientists in Charge: Klaus-Robert Müller, Sebastian Pokutta, Markus Reiß

 EF1-10: Kernel Ensemble Kalman Filter and Inference
Péter Koltai, Nicolas Perkowski
 EF1-11: Quantum Advantages in Machine Learning
Klaus-Robert Müller, Jens Eisert
 EF1-12: Learning Extremal Structures in Combinatorics
Sebastian Pokutta, Tibor Szabó
 EF1-14: Sparsity and Sample-Size Efficiency in Structured Learning
Sebastian Pokutta
 EF1-15: Robust Multilevel Training of Artificial Neural Networks
Michael Hintermüller, Carsten Gräser
 EF1-16: Quiver Representations in Big Data and Machine Learning
Alexander Schmitt
 EF1-17: Data-Driven Robust Model Predictive Control under Distribution Shift
Jia-Jie Zhu, Michael Hintermüller
 EF1-18: Manifold-Valued Graph Neural Networks
Christoph von Tycowicz, Gabriele Steidl
 EF1-19: Machine Learning Enhanced Filtering Methods for Inverse Problems
Claudia Schillings
 EF1-20: Uncertainty Quantification and Design of Experiment for Data-Driven Control
Claudia Schillings
 EF1-21: Scaling up Flag Algebras in Combinatorics
Sebastian Pokutta, Christoph Spiegel
 EF1-22: Bayesian Optimization and Inference for Deep Networks
Claudia Schillings, Vladimir Spokoiny
 EF1-23: On a Frank-Wolfe Approach for Abs-Smooth Optimization
Sebastian Pokutta, Andrea Walther, Zev Woodstock
 EF1-24: Expanding Merlin-Arthur Classifiers: Interpretable Neural Networks through Interactive Proof Systems
Sebastian Pokutta, Stephan Wäldchen
 EF1-25: Wasserstein Gradient Flows for Generalised Transport in Bayesian Inversion
Martin Eigel, Claudia Schillings, Gabriele Steidl
Successfully completed projects:

 EF1-1: Quantifying Uncertainties in Explainable AI
Gitta Kutyniok, Klaus-Robert Müller, Wojciech Samek
 EF1-2: Quantum Kinetics
Klaus-Robert Müller, Frank Noé
 EF1-3: Approximate Convex Hulls With Bounded Complexity
Michael Joswig, Klaus-Robert Müller
 EF1-4: Extracting Dynamical Laws by Deep Neural Networks: A Theoretical Perspective
Jens Eisert, Frank Noé, Barbara Zwicknagl
 EF1-5: On Robustness of Deep Neural Networks
Christian Bayer, Peter Karl Friz
 EF1-6: Graph Embedding for Analyzing the Microbiome
Tim Conrad, Stefan Klus, Grégoire Montavon
 EF1-7: Quantum Machine Learning
Jens Eisert, Klaus-Robert Müller
 EF1-8: Incorporating Locality into Fast(er) Learning
Nicole Mücke
 EF1-9: Adaptive Algorithms through Machine Learning: Exploiting Interactions in Integer Programming
Ambros Gleixner, Sebastian Pokutta
 EF1-13: Stochastic and Rough Aspects in Deep Neural Networks
Christian Bayer, Peter Karl Friz