Christian Bayer, Peter Friz
Nikolas Tapia (TU / WIAS)
01.01.2019 – 31.12.2020
TU Berlin / WIAS
Deep residual neural networks [He, Zhang, Ren, Sun 2016] are an important recent class of deep neural networks. Their incremental nature invites interpretation as an Euler discretization of differential equations [Haber and Ruthotto 2017]. We suggest a far-reaching generalization using signatures and rough path analysis. In particular, we develop a new discrete rough path framework geared towards difference equations, which allows us to obtain tight stability estimates for the output of a residual neural network in terms of the weight matrices.
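The Euler-discretization viewpoint can be made concrete in a few lines. The following is a minimal sketch (with hypothetical random weights and a tanh activation, not the architecture of He et al.): each residual block x ← x + h·f(x; W) is read as one explicit Euler step of the ODE x'(t) = f(x(t), t) with step size h.

```python
import numpy as np

rng = np.random.default_rng(0)

def resnet_forward(x, weights, h=1.0):
    """Apply one Euler step per residual block: x <- x + h * tanh(W @ x).

    With h = 1 this is exactly the residual update x_{k+1} = x_k + f(x_k; W_k);
    letting h -> 0 (with more blocks) recovers the continuous ODE flow.
    """
    for W in weights:
        x = x + h * np.tanh(W @ x)
    return x

# Hypothetical example: d-dimensional state, N residual blocks.
d, N = 4, 10
weights = [0.1 * rng.standard_normal((d, d)) for _ in range(N)]
x0 = rng.standard_normal(d)
out = resnet_forward(x0, weights)
```

In this picture, the stability estimates of the project control how `out` deviates when the weight sequence `weights` is perturbed, measured through its p-variation rather than block by block.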
The weights are taken from an actual trained network of He et al. The picture shows that even in this case, choosing p > 1 can improve the a priori bounds: due to the high variability of the trained weights, the a priori estimate of the deviation is better when one is allowed to choose p > 1.