EEMCS-AM-MOR

I received my PhD in computer science from the University of Lübeck and my "Habilitation" from Saarland University, and I held postdoctoral positions at Saarland University and at Yale University. Since 2009, I have been affiliated with the University of Twente.

Organizations

My research area is the design and analysis of algorithms, in particular smoothed and probabilistic analysis of algorithms.

Publications

2024
Manthey, B. & van Rhijn, J. Worst-Case and Smoothed Analysis of the Hartigan-Wong Method for k-Means Clustering. In 41st International Symposium on Theoretical Aspects of Computer Science (STACS 2024), Article 52. Dagstuhl. https://doi.org/10.4230/LIPIcs.STACS.2024.52

Research profiles

Ongoing projects

Rigorous Analysis of Local Search

Large-scale optimization problems appear in many areas, ranging from engineering and scheduling to the sciences. Unfortunately, for many optimization problems it is unlikely that we can find optimal solutions efficiently. Still, in practice, quite simple local search heuristics often succeed in finding close-to-optimal solutions surprisingly quickly. This is in stark contrast to their theoretically predicted performance, which is usually very poor.
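
A canonical example of such a heuristic is 2-opt local search for the traveling salesman problem, which repeatedly reverses a segment of the tour as long as the reversal shortens it. The sketch below is purely illustrative (the random point instance and all function names are made up for this example, not taken from the project):

    import math
    import random

    def tour_length(points, tour):
        # Total length of the closed tour that visits the points in the given order.
        return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    def two_opt(points, tour):
        # Keep reversing tour segments as long as some reversal shortens the tour.
        n = len(tour)
        improved = True
        while improved:
            improved = False
            for i in range(n - 1):
                for j in range(i + 2, n if i > 0 else n - 1):
                    a, b = tour[i], tour[i + 1]
                    c, d = tour[j], tour[(j + 1) % n]
                    # Replacing edges (a,b) and (c,d) by (a,c) and (b,d) reverses tour[i+1..j].
                    if (math.dist(points[a], points[c]) + math.dist(points[b], points[d])
                            < math.dist(points[a], points[b]) + math.dist(points[c], points[d])):
                        tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                        improved = True
        return tour

    random.seed(0)
    pts = [(random.random(), random.random()) for _ in range(50)]  # illustrative random instance
    tour = list(range(len(pts)))
    print(tour_length(pts, tour), tour_length(pts, two_opt(pts, tour[:])))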

A challenge in the analysis of algorithms is to understand why simple local search heuristics show such a remarkable performance. Beyond plain curiosity, the reason that we want to understand the performance of heuristics is two-fold: First, proving rigorous bounds on their performance provides stronger guarantees than experimental evaluation alone. Second, understanding precedes improving: once we understand why and when a heuristic works well and when it fails, we can exploit this to design better heuristics. Smoothed analysis is a paradigm for analyzing algorithms where classical worst-case analysis fails. Although smoothed analysis is still a young field, it has proved to be a successful tool for analyzing a variety of algorithms.

The goal of this project is to prove rigorous bounds on the performance of heuristics in the framework of smoothed analysis. Beyond "pure" local search algorithms, we will take another step towards rigorously analyzing algorithms used in practice by considering hybrid heuristics and metaheuristics.

Completed projects

Smoothed Analysis of Belief Propagation

Belief propagation is a heuristic approach for solving large-scale statistical inference problems. It is an easy-to-implement heuristic based on message passing that usually converges quickly. As such, it has become very popular, and it has proved to be successful in a wide range of applications. Its success in practice, however, stands in stark contrast to the lack of theoretical understanding of its performance.
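
To make the message-passing idea concrete, here is a minimal sketch of sum-product belief propagation on a small, made-up pairwise binary model on a 4-cycle; the potentials and the fixed number of iterations are illustrative assumptions, not part of the project:

    import numpy as np

    # Illustrative pairwise binary model on a 4-cycle (made-up potentials).
    edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
    unary = np.array([[1.0, 2.0], [2.0, 1.0], [1.5, 1.5], [1.0, 3.0]])  # phi_i(x_i)
    pair = np.array([[2.0, 1.0], [1.0, 2.0]])                           # psi(x_i, x_j), shared by all edges
    neighbors = {v: [u for e in edges for u in e if v in e and u != v] for v in range(4)}

    # messages[(i, j)] is the message from variable i to variable j, a length-2 vector.
    messages = {(i, j): np.ones(2) for a, b in edges for (i, j) in [(a, b), (b, a)]}

    for _ in range(50):  # iterate the update until the messages (hopefully) converge
        new = {}
        for (i, j) in messages:
            # m_{i->j}(x_j) = sum_{x_i} phi_i(x_i) * psi(x_i, x_j) * prod_{k in N(i)\{j}} m_{k->i}(x_i)
            incoming = unary[i] * np.prod([messages[(k, i)] for k in neighbors[i] if k != j], axis=0)
            m = pair.T @ incoming
            new[(i, j)] = m / m.sum()  # normalize for numerical stability
        messages = new

    # Belief at each variable: b_i(x_i) proportional to phi_i(x_i) * prod_{k in N(i)} m_{k->i}(x_i)
    for v in range(4):
        b = unary[v] * np.prod([messages[(k, v)] for k in neighbors[v]], axis=0)
        print("approximate marginal of variable", v, ":", b / b.sum())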

Belief propagation shares this aspect with many algorithms. The reason for the discrepancy between remarkably good performance in practice and unsatisfactory theoretical understanding is often that performance is analyzed in terms of worst-case analysis, which is frequently far too pessimistic. Indeed, worst-case instances are often artificially constructed and rarely show up in applications. An adequate theoretical analysis, however, should measure performance in terms of "typical" rather than "pathological" instances. To provide a more realistic analysis of algorithms, the concept of smoothed analysis has been developed: an adversary specifies an instance, and then the expected performance is measured when this instance is slightly randomly perturbed. Smoothed analysis takes into account that practical data is often noisy, e.g., due to measurement errors. It has already been applied successfully to explain the performance of a variety of algorithms.
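
The perturb-and-average pattern behind smoothed analysis can be sketched in a few lines: the adversary fixes an instance (here, a set of points), independent Gaussian noise with standard deviation sigma is added to every coordinate, and the quantity of interest is averaged over the perturbations. The helper function and the toy measure below are hypothetical, chosen only to illustrate the pattern:

    import numpy as np

    def smoothed_estimate(instance, measure, sigma, trials=100, rng=None):
        # Average the measured quantity over random Gaussian perturbations of the
        # adversarial instance (each coordinate perturbed independently, std. dev. sigma).
        rng = rng or np.random.default_rng(0)
        values = [measure(instance + rng.normal(scale=sigma, size=instance.shape))
                  for _ in range(trials)]
        return float(np.mean(values))

    # Toy demonstration: the adversary puts all points at the origin, a degenerate worst
    # case for many geometric computations; after a small perturbation the minimum
    # pairwise distance is positive in expectation.
    adversarial = np.zeros((10, 2))
    min_dist = lambda pts: min(np.linalg.norm(p - q) for i, p in enumerate(pts) for q in pts[i + 1:])
    print(min_dist(adversarial), smoothed_estimate(adversarial, min_dist, sigma=0.1))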

The aim of this project is a smoothed analysis of belief propagation. We will focus on the application of belief propagation to combinatorial optimization problems. The goal is to get a deeper understanding of its performance and to bridge the gap between theoretical and practical performance of belief propagation.

Framework for Random Metric Spaces

Large-scale optimization problems show up in many domains, such as engineering, scheduling, economics, and the sciences. Unfortunately, finding optimal solutions within reasonable time is often impossible because the problems that have to be solved are computationally intractable. Because of this, optimization problems are nowadays often attacked with ad-hoc heuristics. Many such heuristics show a remarkable performance, but their theoretical (worst-case) performance is poor: worst-case analysis is often too pessimistic to reflect the performance observed in practice. To explain the performance of heuristics, probabilistic analysis is the method of choice, where performance is analyzed with respect to random instances.

The instances of many optimization problems involve, implicitly or explicitly, a metric space. The distances can be physical distances, but also, e.g., travel or transportation costs. Up to now, however, the probabilistic analysis of algorithms has been almost exclusively restricted to Euclidean instances or to instances whose distances are drawn independently, disregarding their metric nature. Both approaches fall short of explaining the average-case performance of heuristics on general metric instances.

Our goal is to develop and apply a framework for random metric spaces. We develop models for random metric spaces, study their properties, and apply these findings to explain the performance of heuristics for optimization problems. With this framework, the average-case analysis of heuristics becomes a more realistic benchmark and yields more conclusive insights into performance than the traditionally used models. This pushes our understanding of heuristics forward, helps users choose the right algorithms, and facilitates the design of better algorithms.
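
One simple way to generate instances that are metric by construction, sketched here under this write-up's assumptions rather than as one of the project's models, is to draw independent random edge weights on a complete graph and take shortest-path distances, which automatically satisfy the triangle inequality:

    import numpy as np

    def random_shortest_path_metric(n, rng=None):
        # Draw i.i.d. exponential weights on the edges of the complete graph on n points,
        # then return the matrix of shortest-path distances (Floyd-Warshall closure).
        # The result satisfies the triangle inequality by construction.
        rng = rng or np.random.default_rng()
        w = rng.exponential(scale=1.0, size=(n, n))
        w = np.minimum(w, w.T)      # symmetric edge weights
        np.fill_diagonal(w, 0.0)
        d = w.copy()
        for k in range(n):          # relax all paths through intermediate point k
            d = np.minimum(d, d[:, k:k + 1] + d[k:k + 1, :])
        return d

    d = random_shortest_path_metric(6, np.random.default_rng(1))
    # Spot-check the triangle inequality.
    assert all(d[i, j] <= d[i, k] + d[k, j] + 1e-12
               for i in range(6) for j in range(6) for k in range(6))
    print(np.round(d, 2))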
