Incubator Projects

Project

IN-8

Infinite-Dimensional Supervised Least Squares Learning as a Noncompact Regularized Inverse Problem

Project Heads

Péter Koltai

Project Members

Mattes Mollenhauer

Project Duration

01.01.2022 – 31.12.2022

Located at

FU Berlin

Description

We propose an approach to spectral regularization algorithms for kernel-based supervised learning with infinite-dimensional response variables. Recent research shows that this scenario enjoys widespread practical use; however, virtually no theoretical results exist, owing to its mathematical complexity compared with the finite-dimensional learning settings investigated so far.
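To fix ideas, the following is a minimal numerical sketch of the estimator in question: regularized least squares (kernel ridge regression) with a vector-valued response, where the response, infinite-dimensional in the project's setting, is truncated to a finite discretization purely for illustration. The Gaussian kernel, bandwidth, and toy data are illustrative choices, not part of the project.

import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gram matrix k(a_i, b_j) = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def krr_fit(X, Y, lam=1e-3, sigma=1.0):
    """Return a predictor x -> estimate of E[Y | X = x] via the closed-form
    regularized least-squares solution f(x) = Y^T (K + n*lam*I)^{-1} k(x)."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), Y)  # (n, d) coefficients
    return lambda x: gaussian_kernel(np.atleast_2d(x), X, sigma) @ alpha

# Toy usage: scalar input, response a 50-point discretized function of X
# (standing in for a genuinely function-valued, infinite-dimensional response).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 1))
t = np.linspace(0, 1, 50)
Y = np.sin(2 * np.pi * (X + t[None, :])) + 0.1 * rng.standard_normal((200, 50))
f = krr_fit(X, Y, lam=1e-2, sigma=0.3)
print(f(0.5).shape)  # (1, 50): predicted response curve at x = 0.5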

 

A powerful approach to deriving convergence rates for supervised learning problems can be formulated in terms of spectral regularization techniques for inverse problems (Caponnetto and De Vito, 2007). However, virtually all of the available literature on this topic requires the underlying linear operator describing the inverse problem to be compact. Recent applications motivate investigating infinite-dimensional supervised learning problems (Singh et al., 2019; Mollenhauer and Koltai, 2020). We have shown that in this case, the underlying linear operator is typically not compact, which can have a drastic impact on the behaviour of the regularization techniques used.
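Schematically, a spectral regularization method replaces the unbounded inverse t → 1/t by a family of filter functions g_λ and forms the estimate g_λ(T)b. Below is a hedged finite-dimensional sketch, with the classical Tikhonov and spectral cut-off filters; the matrix T merely stands in for the covariance-type operator of the learning problem and is not the project's actual (noncompact, infinite-dimensional) operator.

import numpy as np

def tikhonov(t, lam):
    """Tikhonov filter g_lam(t) = 1 / (t + lam)."""
    return 1.0 / (t + lam)

def spectral_cutoff(t, lam):
    """Spectral cut-off: invert eigenvalues above lam, discard the rest."""
    return np.where(t >= lam, 1.0 / np.maximum(t, lam), 0.0)

def regularized_solve(T, b, g, lam):
    """Compute g_lam(T) b for a symmetric PSD matrix T via eigendecomposition."""
    w, V = np.linalg.eigh(T)
    return V @ (g(w, lam) * (V.T @ b))

# Toy usage: an ill-conditioned T; smaller lam fits the data more closely
# but amplifies noise, the classical regularization trade-off.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 100))
T = A @ A.T / 100
b = rng.standard_normal(100)
for lam in (1e-1, 1e-2, 1e-3):
    x = regularized_solve(T, b, tikhonov, lam)
    print(lam, np.linalg.norm(T @ x - b))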

 

The goal of this project is to derive convergence results for the infinite-dimensional scenario by combining techniques from spectral perturbation theory, statistical inverse problems, and concentration of measure in Hilbert and Banach spaces.

 

A specific application of this abstract framework is the kernel-based approximation of Markov transition operators, which is widely used across applied disciplines for the data-driven analysis of dynamical systems (Klus et al., 2020).
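As a concrete instance, the sketch below estimates the leading spectrum of a Markov transition operator from snapshot pairs (x_i, y_i = x_{i+1}) with a regularized kernel estimator in the spirit of kernel EDMD (Klus et al., 2020). The estimator form, function names, and all parameter choices are illustrative assumptions, not the authors' implementation.

import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def transition_eigenvalues(X, Y, lam=1e-4, sigma=1.0, k=5):
    """Leading eigenvalues of the regularized empirical transition matrix
    (Kxx + n*lam*I)^{-1} Kxy built from snapshot pairs (X, Y)."""
    n = X.shape[0]
    Kxx = gaussian_kernel(X, X, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    M = np.linalg.solve(Kxx + n * lam * np.eye(n), Kxy)
    vals = np.linalg.eigvals(M)
    return vals[np.argsort(-np.abs(vals))][:k]

# Toy usage: a noisy rotation on the circle; the leading eigenvalue is
# close to 1 (the invariant distribution), the rest reflect the rotation.
rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 300)
X = np.c_[np.cos(theta), np.sin(theta)]
phi = theta + 0.3 + 0.05 * rng.standard_normal(300)
Y = np.c_[np.cos(phi), np.sin(phi)]
print(transition_eigenvalues(X, Y, lam=1e-3, sigma=0.5))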

Project Webpages

Selected Publications

Blanchard, G. and Mücke, N. (2018). Optimal rates for regularization of statistical inverse learning problems. Foundations of Computational Mathematics, 18:971–1013.

 

Caponnetto, A. and De Vito, E. (2007). Optimal rates for the regularized least-squares algorithm. Foundations of Computational Mathematics, 7(3):331–368.

 

Klus, S., Schuster, I., and Muandet, K. (2020). Eigendecompositions of transfer operators in reproducing kernel Hilbert spaces. Journal of Nonlinear Science, 30(1):283–315.

 

Mollenhauer, M. and Koltai, P. (2020). Nonparametric approximation of conditional expectation operators. arXiv preprint arXiv:2012.12917.

 

Park, J. and Muandet, K. (2020). A measure-theoretic approach to kernel conditional mean embeddings. In Advances in Neural Information Processing Systems, volume 33.

 

Singh, R., Sahani, M., and Gretton, A. (2019). Kernel instrumental variable regression. In Advances in Neural Information Processing Systems, volume 32.

Selected Pictures
