Beyond the Worst-Case Analysis of Algorithms

The complexity of many optimization problems and algorithms seems well understood. Often, however, theoretical results contradict observations made in experiments and in practice: some NP-hard optimization problems can be solved efficiently in practice, and for many problems, algorithms with exponential worst-case running time outperform polynomial-time algorithms. The reason for this discrepancy is the pessimistic worst-case perspective, in which the performance of an algorithm is measured solely by its behavior on the worst possible input. To provide a more realistic measure that can explain the practical performance of algorithms, various alternatives to worst-case analysis have been suggested. We study both probabilistic models, in which inputs are to some extent random, and deterministic models, in which inputs exhibit certain structural properties.

Smoothed Analysis of Algorithms

The framework of smoothed analysis is a hybrid of worst-case and average-case analysis. In this model, inputs are generated in two steps: first, an adversary chooses an arbitrary instance, and then this instance is slightly perturbed at random. The smoothed performance of an algorithm is defined as the worst expected performance the adversary can achieve. The model can be viewed as a less pessimistic worst-case analysis in which the randomness rules out pathological worst-case instances that are rarely observed in practice but dominate the worst-case analysis. Smoothed analysis was introduced in 2001 by Daniel Spielman and Shang-Hua Teng, who showed that the smoothed running time of a certain version of the simplex algorithm is polynomial. Since then, smoothed analysis has been applied to a wide variety of problems and algorithms. We have applied it, among other settings, to multi-objective optimization problems and to local search heuristics such as 2-Opt for the TSP and Lloyd's algorithm for k-means clustering; a formal definition and a small experimental sketch follow below. Our analyses explain why these algorithms are efficient in practice despite their exponential worst-case running time.
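One common way to make the model precise follows the Gaussian-perturbation setting of Spielman and Teng; the normalization of the adversarial instance below is an illustrative choice, not the only one used in the literature:

```latex
% Smoothed running time under Gaussian perturbations (illustrative normalization):
% the adversary fixes an instance \bar{x}, independent Gaussian noise with
% standard deviation \sigma is added, and the worst expectation is measured.
T_A^{\mathrm{smooth}}(n, \sigma)
  \;=\; \max_{\bar{x} \in [-1,1]^{n}}
        \; \mathbb{E}_{g \sim \mathcal{N}(0, \sigma^{2} I_n)}
        \bigl[\, T_A(\bar{x} + g) \,\bigr]
```

As sigma tends to 0, this recovers worst-case analysis; for large sigma, the perturbation drowns out the adversarial choice and the measure approaches an average-case analysis.

The same two-step experiment can also be run empirically. The following minimal Python sketch perturbs the points of an arbitrary Euclidean TSP instance with Gaussian noise and counts the improving steps 2-Opt performs before reaching a local optimum; the instance size, the perturbation magnitude sigma, and the number of trials are illustrative parameters, not values taken from the publications below.

```python
# Minimal sketch of the smoothed-analysis setup for 2-Opt on Euclidean TSP.
# All parameters (instance size, sigma, trials) are illustrative choices.
import math
import random

def two_opt_steps(points):
    """Run 2-Opt to a local optimum; return the number of improving steps."""
    n = len(points)
    tour = list(range(n))
    steps = 0
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue  # these two tour edges share a node; no real move
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                # Replace edges (a,b) and (c,d) by (a,c) and (b,d) if that
                # strictly shortens the tour, i.e., perform a 2-Opt step.
                if (math.dist(points[a], points[c]) + math.dist(points[b], points[d])
                        < math.dist(points[a], points[b]) + math.dist(points[c], points[d])):
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    steps += 1
                    improved = True
    return steps

def smoothed_steps(instance, sigma, trials=20):
    """Empirical analogue of the expectation: average the number of 2-Opt
    steps over independent Gaussian perturbations of the given instance."""
    total = 0
    for _ in range(trials):
        perturbed = [(x + random.gauss(0, sigma), y + random.gauss(0, sigma))
                     for (x, y) in instance]
        total += two_opt_steps(perturbed)
    return total / trials

random.seed(0)
# A stand-in for the adversarial instance; in the model it would be chosen
# to maximize the expected number of steps.
adversarial = [(random.random(), random.random()) for _ in range(40)]
print(smoothed_steps(adversarial, sigma=0.1))
```

The sketch mirrors the formal definition with two simplifications: the maximum over all adversarial instances is replaced by a single fixed instance, and the expectation is replaced by an empirical average over independent trials.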

Selected Publications

  • Michael Etscheid, Heiko Röglin: Smoothed Analysis of Local Search for the Maximum-Cut Problem. ACM Transactions on Algorithms, 13(2), 2017.
  • Matthias Englert, Heiko Röglin, Berthold Vöcking: Smoothed Analysis of the 2-Opt Algorithm for the General TSP. ACM Transactions on Algorithms, 13(1), 2016.
  • Tobias Brunsch, Kamiel Cornelissen, Bodo Manthey, Heiko Röglin, Clemens Rösner: Smoothed Analysis of the Successive Shortest Path Algorithm. SIAM Journal on Computing, 44(6):1798-1819, 2015.
  • Tobias Brunsch, Heiko Röglin: Improved Smoothed Analysis of Multiobjective Optimization. Journal of the ACM, 62(1), 2015.
  • Matthias Englert, Heiko Röglin, Berthold Vöcking: Worst Case and Probabilistic Analysis of the 2-Opt Algorithm for the TSP. Algorithmica, 68(1):190-264, 2014.
  • David Arthur, Bodo Manthey, Heiko Röglin: Smoothed Analysis of the k-Means Method. Journal of the ACM, 58(5), 2011.
