**Project Heads**

*Stefan Klus, Tim Sullivan*

**Project Members**

Ilja Klebanov (ZIB)

**Project Duration**

01.01.2019 – 31.12.2020

**Located at**

ZIB

The aim of this project is to model customer demand and to control item prices in an e-commerce setting, using both reproducing kernel Hilbert space (transfer) operator approaches and models inspired by recurrent neural networks. The collaboration partner Zalando will use the developed prototypical methods to improve supply planning and pricing, taking into account real-world constraints.

The underlying mathematical challenges involve the statistical analysis and optimal control of time series in high-dimensional non-linear spaces. The embedding of these objects into appropriate reproducing kernel Hilbert feature spaces offers a way to faithfully linearise these problems and make them amenable to computation.

The first publication [1] stemming from this project is “A rigorous theory of conditional mean embeddings”. Conditional mean embeddings (CMEs) have proven themselves to be a powerful tool in many machine learning applications. They allow the efficient conditioning of probability distributions within the corresponding reproducing kernel Hilbert spaces (RKHSs) by providing a linear-algebraic relation for the kernel mean embeddings of the respective joint and conditional probability distributions. Both centred and uncentred covariance operators have been used to define CMEs in the existing literature. In this paper, we develop a mathematically rigorous theory for both variants, discuss the merits and problems of each, and significantly weaken the conditions for applicability of CMEs. In the course of this, we demonstrate a beautiful connection to Gaussian conditioning in Hilbert spaces.
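The linear-algebraic relation mentioned above can be illustrated with the standard empirical estimator of the CME: given samples (x_i, y_i), the embedding of the conditional distribution of Y given X = x is a weighted sum of kernel features of the y_i, with weights obtained from a regularised linear solve. The following sketch (not code from the project; kernel bandwidth, regularisation parameter, and test data are illustrative choices) estimates a conditional expectation this way:

```python
import numpy as np

def rbf(A, B, sigma=0.5):
    # Gaussian kernel matrix k(a, b) = exp(-|a - b|^2 / (2 sigma^2))
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma**2))

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-2, 2, n)
y = x**2 + 0.05 * rng.normal(size=n)  # noisy samples of Y = X^2

# Empirical CME weights: beta(x0) = (K + n*lambda*I)^{-1} k_{x0},
# so that mu_{Y|X=x0} = sum_i beta_i(x0) l(y_i, .) and, for a
# function f in the RKHS on Y, E[f(Y) | X=x0] ~ sum_i beta_i(x0) f(y_i).
lam = 1e-3
K = rbf(x, x)
x0 = np.array([1.0])
beta = np.linalg.solve(K + n * lam * np.eye(n), rbf(x, x0))

# Taking f = identity gives an estimate of E[Y | X = 1.0], which
# should be close to 1.0 for this data.
cond_mean = float(beta[:, 0] @ y)
```

The conditioning step itself is nothing more than a single regularised linear solve in the sample space, which is exactly the "elementary linear algebra" payoff of working in the RKHS.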

This first paper [1] leads naturally to the study of the linear conditional expectation (LCE) in Hilbert spaces, e.g. spaces of time series. The LCE provides a best linear (or rather, affine) estimate of the conditional expectation and hence plays an important rôle in approximate Bayesian inference, especially the Bayes linear approach. In our second paper [2], we establish the analytical properties of the LCE in an infinite-dimensional Hilbert space context. In addition, working in the space of affine Hilbert–Schmidt operators, we develop a regularisation procedure for this LCE. As an important application, we obtain a simple alternative derivation and intuitive justification of the CME formula.
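In finite dimensions, the LCE takes the familiar Bayes linear form E_L[Y | X = x] = E[Y] + C_YX C_XX^{-1} (x − E[X]), and for an empirical distribution it coincides with the solution of the linear least-squares regression problem. A minimal numerical sketch of this coincidence (synthetic data; not code from the project):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 3))
Y = X @ np.array([[1.0], [-2.0], [0.5]]) + 0.1 * rng.normal(size=(n, 1))

# Empirical means and (cross-)covariances
mx, my = X.mean(axis=0), Y.mean(axis=0)
Xc, Yc = X - mx, Y - my
Cxx = Xc.T @ Xc / n
Cyx = Yc.T @ Xc / n

# LCE: E_L[Y | X = x] = E[Y] + C_YX C_XX^{-1} (x - E[X])
A = Cyx @ np.linalg.inv(Cxx)
x_test = np.array([0.2, -0.4, 1.0])
lce = my + A @ (x_test - mx)

# Linear least-squares regression on the centred data yields the
# same affine map: W = (Xc^T Xc)^{-1} Xc^T Yc, i.e. A = W^T.
W, *_ = np.linalg.lstsq(Xc, Yc, rcond=None)
lsq = my + W.T @ (x_test - mx)
```

The two predictions agree up to floating-point error, since C_YX C_XX^{-1} is algebraically identical to the transposed least-squares coefficient matrix.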

**Selected Publications**

[1] I. Klebanov, I. Schuster, and T. J. Sullivan. “A rigorous theory of conditional mean embeddings.” *SIAM J. Math. Data Sci.* 2(3):583–606, 2020. doi:10.1137

[2] I. Klebanov, B. Sprungk, and T. J. Sullivan. “The linear conditional expectation in Hilbert space.” *Bernoulli*, 2021. doi:10.3150/20-BEJ1308. arXiv:2008.12070

**Selected Pictures**

While conditioning of the probability distributions in the original spaces *X, Y* is a possibly complicated, non-linear problem, the corresponding formula for their kernel mean embeddings reduces to elementary linear algebra – a common guiding theme when working with reproducing kernel Hilbert spaces.

*Left:* Comparison of the conditional expectation function and the linear conditional expectation function (LCEF). The contour plot shows the joint probability density. *Right:* For an empirical probability distribution (e.g. given by data), the LCEF coincides with the solution to the linear least squares regression problem.
