EF1 – Extracting Dynamical Laws from Complex Data

Project

EF1-4

Extracting Dynamical Laws by Deep Neural Networks:
A Theoretical Perspective

Project Heads

Jens Eisert, Frank Noé, Barbara Zwicknagl

Project Members

Alex Goeßmann (TU) 

Project Duration

01.01.2019 – 31.12.2021

Located at

TU Berlin

Motivation

A central desideratum for any scientific model is an assessment of its limitations. In recent years, tools for automated model discovery from training data have been developed in the areas of supervised machine learning and tensor optimization. However, such methods lack a solid theoretical foundation that would provide estimates of their limitations.

Statistical learning theory quantifies the limitations of a trained model in terms of its generalization error and provides bounds on that quantity whenever uniform concentration events occur. However, the statistical properties of random tensors and neural networks are poorly understood, which hinders the application of such tools from learning theory.
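To fix terminology (this is the standard definition from learning theory, not a contribution of this project): for a hypothesis f with loss function ℓ, data distribution μ and n i.i.d. samples, the generalization error is the gap between the population risk R(f) and the empirical risk,

\[
  \mathrm{gen}(f) \;=\; R(f) - \hat R_n(f)
  \;=\; \mathbb{E}_{(x,y)\sim\mu}\,\ell\bigl(f(x),y\bigr) \;-\; \frac{1}{n}\sum_{i=1}^{n} \ell\bigl(f(x_i),y_i\bigr),
\]

and a uniform concentration event is an event on which \(\sup_{f\in\mathcal{F}} \lvert R(f) - \hat R_n(f)\rvert \le \varepsilon\) holds over the entire hypothesis class \(\mathcal{F}\), so that the generalization error of any learned model in the class is at most \(\varepsilon\).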

We aim to build a bridge between these two areas by analysing the statistics of random tensors and neural networks. To formulate the consequences for the design of automated model discovery, we plan to assess our findings in broad numerical studies.

Approach

This project will develop a thorough theoretical understanding of neural and tensor networks for extracting dynamical laws from complex data, focusing predominantly on recovery guarantees. To this end, we investigate the stochastic properties of neural and tensor networks and provide probability bounds for exact model recovery. We aim to derive various sample complexity bounds, with direct implications for the design of the learning architecture.
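The guarantees we target have the following generic shape (a schematic template, not a result; the appropriate complexity measure and constants are precisely what the project aims to determine):

\[
  \Pr\bigl[\hat f_n = f^{\ast}\bigr] \;\ge\; 1 - \delta
  \qquad\text{whenever}\qquad
  n \;\ge\; C \cdot \mathrm{comp}(\mathcal{F}) \cdot \log(1/\delta),
\]

where \(f^{\ast}\) is the true model, \(\hat f_n\) the model recovered from n samples, and \(\mathrm{comp}(\mathcal{F})\) a complexity measure of the network class.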

During the first year of the funding period, we studied the case of linear activation functions and developed a tensor network regression ansatz for learning non-linear dynamical laws. Furthermore, we analysed the similarities between function identification and dynamical law extraction. In ongoing work, we bound the sample complexity of identifying tensors from data, applying concepts from compressed sensing and the chaining theory of stochastic processes.
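To illustrate the regression setting, the following is a minimal sketch in plain NumPy of recovering a dynamical law dx/dt = f(x) from trajectory snapshots and noisy derivative estimates by least-squares regression onto a monomial dictionary. It is purely illustrative: it does not impose the tensor network structure of our ansatz, and the ground-truth law, dictionary and sample sizes are hypothetical choices.

import numpy as np

rng = np.random.default_rng(0)

def dictionary(X):
    """Monomial features up to total degree 2 for states X of shape (n, d)."""
    n, d = X.shape
    feats = [np.ones(n)]
    feats += [X[:, j] for j in range(d)]
    feats += [X[:, j] * X[:, k] for j in range(d) for k in range(j, d)]
    return np.stack(feats, axis=1)

def f_true(X):
    # Hypothetical ground-truth law dx/dt = f(x) in two dimensions
    x, y = X[:, 0], X[:, 1]
    return np.stack([-y + 0.1 * x * y, x - 0.2 * x ** 2], axis=1)

X = rng.uniform(-1.0, 1.0, size=(200, 2))               # sampled states
dX = f_true(X) + 0.01 * rng.standard_normal((200, 2))   # noisy derivative estimates

Phi = dictionary(X)                                     # feature matrix, here (200, 6)
coeffs, *_ = np.linalg.lstsq(Phi, dX, rcond=None)       # one linear system per coordinate

X_test = rng.uniform(-1.0, 1.0, size=(5, 2))
print(np.abs(dictionary(X_test) @ coeffs - f_true(X_test)).max())

In this toy problem the true law lies in the span of the dictionary, so the coefficients are recovered up to noise; a tensor network ansatz additionally imposes low-rank structure on the coefficient tensor to tame the exponential growth of the dictionary in high dimensions.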

We have furthermore investigated the uniform concentration of ReLU networks and bounded the sample complexities that guarantee small generalization errors when minimizing least-squares risks over shallow neural networks.
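For concreteness, this is the optimization problem in that setting: empirical least-squares risk minimization over a one-hidden-layer ReLU network. The sketch below (plain NumPy gradient descent; architecture, target and hyperparameters are illustrative choices, not those of the paper) shows the training procedure whose generalization behaviour the bounds address.

import numpy as np

rng = np.random.default_rng(0)
n, d, width = 200, 5, 64

X = rng.standard_normal((n, d))
y = np.sin(X @ rng.standard_normal(d))              # hypothetical target function

W = rng.standard_normal((d, width)) / np.sqrt(d)    # hidden-layer weights
a = rng.standard_normal(width) / np.sqrt(width)     # output weights

def forward(X, W, a):
    H = np.maximum(X @ W, 0.0)                      # ReLU activations
    return H @ a, H

lr = 0.05
for step in range(2000):
    pred, H = forward(X, W, a)
    r = pred - y                                    # residuals
    # Gradients of the empirical least-squares risk (1/n) * sum (pred - y)^2
    grad_a = 2.0 / n * H.T @ r
    grad_W = 2.0 / n * X.T @ ((r[:, None] * (H > 0)) * a)
    a -= lr * grad_a
    W -= lr * grad_W

print("empirical risk:", np.mean((forward(X, W, a)[0] - y) ** 2))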

 

Selected Publications

A. Goeßmann, M. Götte, I. Roth, R. Sweke, G. Kutyniok and J. Eisert: “Tensor network approaches for learning non-linear dynamical laws”, NeurIPS 2020, First Workshop on Quantum Tensor Networks

A. Goeßmann and G. Kutyniok: “The Restricted Isometry of ReLU Networks: Generalization through Norm Concentration”, arXiv:2007.00479 (2020)
