Philipp Wacker (until 09/22), Mattes Mollenhauer (from 02/23)
01.03.2022 − 29.02.2024
The project aims to combine novel techniques from machine learning with Kalman-based filtering approaches for inverse problems. We will investigate subsampling strategies and surrogate-enhanced variants to improve performance for high-dimensional data spaces and highly complex forward models. Strategies to incorporate constraints on the parameters will be developed by establishing the link to the Bayesian approach to inverse problems.
We pursue two major workstreams in the course of this project:
(1) Theoretical foundations of infinite-dimensional inference. We address the theory of (sub-)Gaussian measures in Hilbert and Banach spaces [3] and seek to combine it with recent theory of infinite-dimensional estimation [2, MATH+ IN-8] in order to lay the groundwork for generalised results complementing ideas from workstream (2).
(2) Investigation of specific numerical schemes. We investigate properties of specific data-driven methodologies in Bayesian analysis, inverse problems and numerical analysis in order to understand their behaviour when applied to high-dimensional problems [1,4].
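Reference [1] analyses the continuous-time limit of stochastic ensemble Kalman inversion (EKI), one of the data-driven schemes studied in workstream (2). As illustration only, the following is a minimal sketch of the basic discrete EKI update on a toy linear inverse problem; the function names and the example problem are illustrative assumptions, not part of the project code.

```python
import numpy as np

def eki_step(U, G, y, Gamma, rng=None):
    """One ensemble Kalman inversion (EKI) update.

    U: (J, d) ensemble of parameter particles
    G: forward map, G(u) -> (k,) data-space vector
    y: (k,) observed data
    Gamma: (k, k) observational noise covariance
    rng: if given, perturbed observations are used (stochastic EKI)
    """
    J = U.shape[0]
    GU = np.array([G(u) for u in U])            # (J, k) forward evaluations
    u_bar, g_bar = U.mean(axis=0), GU.mean(axis=0)
    # Empirical cross-covariance C^{uG} and forward-output covariance C^{GG}
    Cug = (U - u_bar).T @ (GU - g_bar) / J      # (d, k)
    Cgg = (GU - g_bar).T @ (GU - g_bar) / J     # (k, k)
    # Perturb the data per particle in the stochastic variant
    Y = y + (rng.multivariate_normal(np.zeros(len(y)), Gamma, size=J)
             if rng is not None else 0.0)
    # Kalman-type gain: C^{uG} (C^{GG} + Gamma)^{-1}
    K = Cug @ np.linalg.solve(Cgg + Gamma, np.eye(len(y)))
    return U + (Y - GU) @ K.T

# Toy linear inverse problem: recover u_true from y = A u + noise
rng = np.random.default_rng(0)
A = np.array([[1.0, 0.5], [0.0, 1.0], [2.0, -1.0]])
u_true = np.array([1.0, -2.0])
Gamma = 0.01 * np.eye(3)
y = A @ u_true + rng.multivariate_normal(np.zeros(3), Gamma)

U = rng.normal(size=(50, 2))                    # initial ensemble
for _ in range(30):
    U = eki_step(U, lambda u: A @ u, y, Gamma)
print(U.mean(axis=0))                           # ensemble mean, close to u_true
```

For a linear forward map the ensemble mean approaches the (data-misfit) least-squares solution as the iteration proceeds, while the ensemble spread collapses; the stochastic variant (passing `rng`) counteracts this collapse via perturbed observations.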
[1] Blömker, D., Schillings, C., Wacker, P., & Weissmann, S. (2022). Continuous time limit of the stochastic ensemble Kalman inversion: Strong convergence analysis. SIAM Journal on Numerical Analysis, 60(6), 3181-3215.
[2] Mollenhauer, M., Mücke, N., & Sullivan, T. J. (2022). Learning linear operators: Infinite-dimensional regression as a well-behaved non-compact inverse problem. arXiv preprint arXiv:2211.08875.
[3] Mollenhauer, M., & Schillings, C. (2023). On the concentration of subgaussian vectors and positive quadratic forms in Hilbert spaces. arXiv preprint arXiv:2306.11404.
[4] Schillings, C., Totzeck, C., & Wacker, P. (2022). Ensemble-based gradient inference for particle methods in optimization and sampling. arXiv preprint arXiv:2209.15420.