EF1 – Extracting Dynamical Laws from Complex Data



Incorporating Locality into Fast(er) Learning

Project Heads

Nicole Mücke

Project Members

Project Duration

2 years

Located at

TU Berlin


Classical stochastic approximation methods such as SGD in reproducing kernel Hilbert spaces cannot exploit regions where the target function has different regularity; this slows down local convergence dramatically. To overcome this drawback, we propose to analyze localized SGD approaches.
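The idea can be illustrated with a toy sketch (not the project's actual method): plain kernel SGD fits one global model, while a localized variant partitions the input space and runs an independent kernel SGD on each cell, so each local model can adapt to the regularity of its region. All function names, the Gaussian kernel, the step size, and the uniform partition below are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth=0.2):
    """Gaussian (RBF) kernel between scalar inputs (bandwidth is an assumption)."""
    return np.exp(-(x - y) ** 2 / (2 * bandwidth ** 2))

def kernel_sgd(X, y, n_passes=10, step=0.5):
    """Plain kernel SGD for least squares: one coefficient per training point."""
    n = len(X)
    alpha = np.zeros(n)
    for _ in range(n_passes):
        for i in np.random.permutation(n):
            pred = alpha @ gaussian_kernel(X, X[i])      # current prediction at x_i
            alpha[i] += step * (y[i] - pred)             # stochastic gradient step
    return alpha

def localized_kernel_sgd(X, y, n_regions=4, **kw):
    """Localized variant: partition the input range, fit one model per cell."""
    edges = np.linspace(X.min(), X.max(), n_regions + 1)
    models = []
    for j in range(n_regions):
        mask = (X >= edges[j]) & (X <= edges[j + 1])
        models.append((X[mask], kernel_sgd(X[mask], y[mask], **kw)))
    return edges, models

def predict_local(x, edges, models):
    """Predict with the model of the cell containing x."""
    j = min(np.searchsorted(edges, x, side="right") - 1, len(models) - 1)
    Xj, alpha = models[j]
    return alpha @ gaussian_kernel(Xj, x)

# Toy usage: fit a smooth target on [0, 1].
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 1.0, 80))
y = np.sin(2 * np.pi * X)
edges, models = localized_kernel_sgd(X, y)
preds = np.array([predict_local(x, edges, models) for x in X])
```

Because each cell trains on only a fraction of the data, the per-step cost of the local updates is also smaller than for the global model, which is one source of the "faster learning" in the project title.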

Project Webpages

Selected Publications

[1] Nicole Mücke, Enrico Reiss, Stochastic Gradient Descent in Hilbert Scales: Smoothness, Preconditioning and Earlier Stopping, arXiv:2006.10840.

[2] Nicole Mücke, Gergely Neu, Lorenzo Rosasco, Beating SGD Saturation with Tail-Averaging and Minibatching, 33rd Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada.


[3] Nicole Mücke, Reducing training time by efficient localized kernel regression, Proceedings of Machine Learning Research, PMLR 89:2603-2610, 2019.


