AA3 – Networks

Project

AA3-7

Beyond the Worst-Case: Data-Dependent Rates in Learning and Optimization

Project Heads

Sebastian Pokutta

Project Members

Francisco Criado Gallart

Project Duration

01.01.2021 – 31.12.2022

Located at

TU Berlin

Description

Worst-case complexity bounds are increasingly insufficient to explain the (often superior) real-world performance of optimization and learning algorithms. We will develop data-dependent rates, approximation guarantees, and complexity bounds that are much more in line with the performance observed in practice. We are particularly interested in exploiting properties of the feasible region to obtain algorithms that adapt to the structure of the problem.

A more technical overview can be found [here].
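
For illustration, the sketch below shows the classical Frank-Wolfe (conditional gradient) method over the probability simplex, the projection-free template behind several of the related publications listed further down. It is a minimal Python sketch, not the project's own code (the project's toolbox is FrankWolfe.jl); the function name, the quadratic example objective, and the agnostic step size 2/(t+2) are illustrative choices, and data-dependent or adaptive variants would replace that step-size rule.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, num_steps=100):
    """Vanilla Frank-Wolfe over the probability simplex (illustrative sketch).

    grad: callable returning the gradient of the objective at x.
    x0:   feasible starting point (nonnegative entries summing to 1).
    """
    x = x0.copy()
    for t in range(num_steps):
        g = grad(x)
        # Linear minimization oracle: over the simplex, the minimizer of
        # <g, v> is the vertex e_i with the smallest gradient coordinate.
        i = int(np.argmin(g))
        v = np.zeros_like(x)
        v[i] = 1.0
        # Classical agnostic step size; adaptive/data-dependent rules
        # studied in the project would replace this choice.
        gamma = 2.0 / (t + 2.0)
        x = (1.0 - gamma) * x + gamma * v
    return x

# Usage example: minimize ||x - b||^2 over the simplex (hypothetical data).
b = np.array([0.2, 0.5, 0.3])
x_star = frank_wolfe_simplex(lambda x: 2.0 * (x - b), np.ones(3) / 3)
```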

External Website

Related Publications

  1. Besançon, M., Carderera, A., and Pokutta, S. (2021). FrankWolfe.jl: A High-Performance and Flexible Toolbox for Frank-Wolfe Algorithms and Conditional Gradients. Preprint [arXiv] [code]
  2. Combettes, C. W., and Pokutta, S. (2019). Revisiting the Approximate Carathéodory Problem via the Frank-Wolfe Algorithm. Preprint [arXiv] [summary] [slides] [code] [video]
  3. Kerdreux, T., Roux, C., d’Aspremont, A., and Pokutta, S. (2021). Linear Bandits on Uniformly Convex Sets. Preprint [arXiv] [summary]
  4. Carderera, A., Diakonikolas, J., Lin, C. Y., and Pokutta, S. (2021). Parameter-free Locally Accelerated Conditional Gradients. Preprint [arXiv]
  5. Combettes, C. W., Spiegel, C., and Pokutta, S. (2020). Projection-Free Adaptive Gradients for Large-Scale Optimization. Preprint [arXiv] [summary] [code]
  6. Kerdreux, T., d’Aspremont, A., and Pokutta, S. (2021). Local and Global Uniform Convexity Conditions. Preprint [arXiv]