We are happy to announce the workshop “Optimization meets Parameterized Complexity” on December 13-14, 2016. Invited speakers are

- Fabrizio Grandoni (IDSIA, Lugano)
- Tobias Harks (Universität Augsburg)
- Bart Jansen (Eindhoven University of Technology)
- Erik Jan van Leeuwen (Max-Planck-Institut für Informatik, Saarbrücken)
- Volker Kaibel (Otto-von-Guericke-Universität Magdeburg)
- Lukasz Kowalik (University of Warsaw)
- Peter Rossmanith (RWTH Aachen)
- Gerhard Woeginger (RWTH Aachen)

Participation is free of charge. For organizational purposes, **please register if you plan to attend by November 27**.
To do so, please write an e-mail with the subject “Workshop Registration” to secret5@cs.uni-bonn.de containing your name, address, and affiliation. Please also state in the e-mail whether you plan to participate in the workshop on both days.

The workshop will take place at the Universitätsclub Bonn. The venue is located at Konviktstraße 9, 53113 Bonn.

The workshop is jointly organized by

- Stefan Kratsch (Theoretical Computer Science, University of Bonn)
- Britta Peis (Management Science, RWTH Aachen University)
- Heiko Röglin (Theoretical Computer Science, University of Bonn)

The workshop is financially supported by the University of Cologne (seed funding for the project “ABC – New algorithmic challenges in operations research”).

Tuesday December 13:

- 09:00 - 09:15 Opening
- 10:15 - 10:45 Coffee Break
- 10:45 - 11:45 Peter Rossmanith "Treewidth"
- 12:15 - 13:15 Lunch Break
- 14:45 - 15:15 Coffee Break

Wednesday December 14:

- 10:00 - 10:30 Coffee Break
- 12:00 - 13:00 Lunch Break
- 13:00 - 14:00 Volker Kaibel "Describing Integer Points in Polyhedra"
- 14:00 - 14:30 Coffee Break
- 14:30 - 15:30 Gerhard Woeginger "Lower bounds for easy problems"
- 15:30 - 16:00 Closing

When solving a hard computational problem, the running time can often be reduced by using a preprocessing step that throws away irrelevant parts of the data which are guaranteed not to affect the final answer. Until recently, there was no good explanation for the effectiveness of preprocessing. This changed when the notion of kernelization was developed within the field of parameterized complexity. Kernelization has been called “the lost continent of polynomial time”, since the exploration of this formal model of preprocessing has led to a surprisingly rich set of techniques that can reduce the size of NP-hard problem inputs in polynomial time, without changing the answer. Using a user-defined complexity parameter, one can also give theoretical guarantees on the amount of data reduction that is achieved. This talk gives an introduction to kernelization by showcasing some of the gems of the area: elegant preprocessing schemes built on nontrivial mathematical insights.
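One of the simplest kernels of this kind (an illustration added here, not necessarily one of the examples covered in the talk) is Buss's kernel for Vertex Cover: any vertex of degree greater than k must belong to every vertex cover of size at most k, and once no such vertex remains, a yes-instance can have at most k² edges. A minimal Python sketch, with illustrative function names:

```python
def buss_kernel(edges, k):
    """Buss kernelization for Vertex Cover.

    Repeatedly put any vertex of degree > k into the cover (it must be
    in every cover of size <= k), then check the size bound: a reduced
    yes-instance has at most k^2 edges.
    Returns (reduced_edges, k') or None if (edges, k) is a no-instance.
    """
    edges = {frozenset(e) for e in edges}
    changed = True
    while changed and k >= 0:
        changed = False
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        for v, d in deg.items():
            if d > k:                      # v must be in the cover
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:        # size bound violated: "no"
        return None
    return edges, k
```

For example, on a star with five leaves and k = 1, the center is forced into the cover and the kernel becomes empty.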

Treewidth is a concept that has independently been invented many times. It measures how “treelike” a graph is. Having low treewidth enables us to solve many otherwise hard problems quite efficiently. Even if graphs do not have small treewidth, treewidth-based techniques are still useful: very often a graph problem can efficiently be simplified if the treewidth is high and efficiently solved if the treewidth is low. Apart from specialised approaches to solve hard problems using small treewidth, there exist very general meta-theorems for a huge class of decision and optimization problems. There are even reasons to believe that low treewidth is a necessary condition for the existence of polynomial time algorithms for many problems.
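The simplest case of such treewidth-based dynamic programming is Maximum Independent Set on trees (treewidth 1), solvable in linear time by keeping, for each vertex, the best solution that includes or excludes it. A minimal sketch (illustrative, not taken from the talk):

```python
def max_independent_set_tree(adj, root):
    """DP on a tree (treewidth 1): for each vertex keep the size of the
    best independent set in its subtree with and without that vertex."""
    # iterative preorder (avoids recursion limits on deep trees)
    order, parent = [], {root: None}
    stack = [root]
    while stack:
        u = stack.pop()
        order.append(u)
        for v in adj[u]:
            if v != parent[u]:
                parent[v] = u
                stack.append(v)
    incl, excl = {}, {}
    for u in reversed(order):              # process children before parents
        children = [v for v in adj[u] if v != parent[u]]
        incl[u] = 1 + sum(excl[v] for v in children)
        excl[u] = sum(max(incl[v], excl[v]) for v in children)
    return max(incl[root], excl[root])
```

On the path on five vertices this returns 3; general tree decompositions replace the single vertex in the DP state by a bag of at most treewidth + 1 vertices.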

A graph G = (V, E) is called equidominating if there exists a value t ∈ N and a weight function ω : V → N such that the total weight of a subset D ⊆ V is equal to t if and only if D is a minimal dominating set. To decide whether or not a given graph is equidominating is referred to as the equidomination problem. In this paper we show that the equidomination problem is fixed-parameter tractable when parameterized in two different ways: The first parameterization considers the target value t, leading to the target-t equidomination problem. The second parameterization allows only weights up to a fixed value k, which yields the k-equidomination problem.
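The definition can be made concrete with a brute-force check on tiny graphs (exponential time; the function names are illustrative additions, not from the paper). For the triangle K3, unit weights with t = 1 witness equidomination, since its minimal dominating sets are exactly the singletons:

```python
from itertools import combinations

def is_dominating(adj, D):
    # D dominates if every vertex is in D or adjacent to a vertex of D
    return all(v in D or adj[v] & D for v in adj)

def is_minimal_dominating(adj, D):
    # dominating, and removing any single vertex destroys domination
    return is_dominating(adj, D) and all(
        not is_dominating(adj, D - {v}) for v in D)

def witnesses_equidomination(adj, w, t):
    """Check by brute force that (w, t) witnesses equidomination:
    a subset has total weight t iff it is a minimal dominating set."""
    V = list(adj)
    for r in range(len(V) + 1):
        for D in map(set, combinations(V, r)):
            if (sum(w[v] for v in D) == t) != is_minimal_dominating(adj, D):
                return False
    return True
```

For the path on three vertices the same weights fail: a leaf alone has weight 1 but does not dominate the opposite leaf.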

In this talk, we focus on two major problems in Network Design. First, we consider the Steiner Tree problem, which, parameterized by the size of the tree, is fixed-parameter tractable on general graphs. On planar graphs, however, this algorithm can be significantly improved upon. In fact, the problem has a polynomial kernel on planar graphs (Pilipczuk, Pilipczuk, Sankowski, van Leeuwen, FOCS 2014), which does not exist for general graphs (under a complexity-theoretic assumption). We also discuss why, on planar graphs, Steiner Tree behaves very differently from many other problems. Second, we consider the Disjoint Paths problem, which also has a parameterized algorithm on general graphs but no polynomial kernel (under a complexity-theoretic assumption). We show that for this problem and several of its variations, significantly faster algorithms and even polynomial kernels exist on a variety of graph classes.

(Joint work with Martin Groß, Anupam Gupta, Amit Kumar, Jannik Matuschke, Melanie Schmidt and José Verschae)

The dynamic Steiner Forest problem is a long-standing open problem: terminal pairs arrive over time, and we want to maintain a competitive solution using few local changes. We try to shed some light on the question by giving a local search algorithm for the traditional Steiner Forest problem. Even though we cannot answer the open problem, similar techniques were essential to tackle the dynamic MST/Steiner Tree problem. Our algorithm runs in polynomial time and has a constant locality gap.

Deciding whether a given graph contains a path of length k is a natural generalization of the classic Hamiltonian path problem. In 1994, Alon et al. invented the nowadays standard color-coding technique and solved the k-path problem in 5.5^k*poly(n) time. However, this only started the race of faster and faster FPT algorithms for this problem. In 2008, Koutis presented a completely new, algebraic approach to the k-path problem. Two years later, Bjorklund reformulated the ideas of Koutis in the language of evaluating polynomials. This resulted in breaking the 2^n barrier for deciding Hamiltonicity and, later, in a k-path algorithm running in time 1.66^k*poly(n) due to Bjorklund et al. During the talk we will illustrate this approach with a simpler (but slower) algorithm running in time 2^k*poly(n). If time permits, we will sketch the improvements leading to breaking the 2^k barrier.
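A hedged sketch of the original color-coding technique of Alon et al. (not the algebraic algorithm the talk presents): randomly color the vertices with k colors and look for a "colorful" k-path, i.e. one using each color at most once, by dynamic programming over color subsets. Function names are illustrative:

```python
import random

def colorful_path_exists(adj, color, k):
    # dp[v] = set of color-subsets of colorful paths ending at v;
    # k - 1 extension rounds grow the paths to k vertices
    dp = {v: {frozenset([color[v]])} for v in adj}
    for _ in range(k - 1):
        new = {v: set() for v in adj}
        for u in adj:
            for S in dp[u]:
                for v in adj[u]:
                    if color[v] not in S:
                        new[v].add(S | {color[v]})
        dp = new
    return any(dp[v] for v in adj)

def has_k_path(adj, k, trials=100):
    """Color-coding: a fixed k-path becomes colorful with probability
    >= k!/k^k under a random k-coloring, and colorful paths are found
    by the 2^k * poly(n) DP above; repeating boosts the success
    probability (one-sided error: a path may be missed, never invented)."""
    for _ in range(trials):
        color = {v: random.randrange(k) for v in adj}
        if colorful_path_exists(adj, color, k):
            return True
    return False
```

Since distinct colors force distinct vertices, the DP never has to remember which vertices were used, only which colors.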

I will describe an algorithm for the subset sum problem that runs in 2^{0.86n} time and uses polynomial space. Previously, all algorithms with running time less than 2^n used exponential space, and obtaining such a guarantee was open. Our algorithm is based on Floyd's space-efficient technique for cycle finding and builds on some recent work of Beame et al. Our technique also leads to faster space-efficient algorithms for some other problems. Based on joint work with Nikhil Bansal, Jesper Nederlof and Nikhil Vyas.
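The cycle-finding primitive referred to here is classical; a minimal sketch of Floyd's tortoise-and-hare algorithm, which finds the cycle entered by iterating a function using only O(1) memory (the subset sum application itself is considerably more involved):

```python
def floyd_cycle(f, x0):
    """Floyd's tortoise-and-hare cycle finding.

    Iterating f from x0 eventually enters a cycle; return (mu, lam),
    where mu is the index of the first element on the cycle and lam is
    the cycle length, using O(1) memory beyond the function itself."""
    # Phase 1: advance pointers at speeds 1 and 2 until they meet
    tortoise, hare = f(x0), f(f(x0))
    while tortoise != hare:
        tortoise, hare = f(tortoise), f(f(hare))
    # Phase 2: restart one pointer at x0; they now meet at the cycle start
    mu, tortoise = 0, x0
    while tortoise != hare:
        tortoise, hare = f(tortoise), f(hare)
        mu += 1
    # Phase 3: walk once around the cycle to measure its length
    lam, hare = 1, f(tortoise)
    while tortoise != hare:
        hare = f(hare)
        lam += 1
    return mu, lam
```

For instance, for f(x) = x + 1 if x < 5 else 2 started at 0, the sequence 0, 1, 2, 3, 4, 5, 2, … has tail length 2 and cycle length 4.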

(Joint work with: Aris Anagnostopoulos, Salvatore Ingala, Stefano Leonardi, Tobias Moemke, Sumedha Uniyal, Andreas Wiese, Hang Zhou.)

In the unsplittable flow on a path problem (UFP) we are given a path with non-negative edge capacities and a set of tasks, each one characterized by a subpath, a demand, and a profit. Our goal is to select a subset of tasks of maximum total profit so that the total demand of the selected tasks on each edge does not exceed the respective edge capacity.

I will overview some of the approximation algorithms for UFP and related problems, including my work on this topic.
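To make the problem statement above concrete, here is an exhaustive solver for tiny UFP instances (exponential time, for illustration only; the function name and instance encoding are assumptions made here, not from the talk):

```python
from itertools import combinations

def ufp_brute_force(m, capacity, tasks):
    """Exhaustive solver for tiny Unsplittable Flow on a Path instances.

    m        -- number of edges on the path (edges 0..m-1)
    capacity -- list of non-negative edge capacities, length m
    tasks    -- list of (first_edge, last_edge, demand, profit); a task
                uses edges first_edge..last_edge inclusive
    Returns the maximum total profit of a subset of tasks whose total
    demand respects every edge capacity."""
    best = 0
    for r in range(len(tasks) + 1):
        for sel in combinations(tasks, r):
            load = [0] * m
            for a, b, d, _ in sel:
                for e in range(a, b + 1):
                    load[e] += d
            if all(load[e] <= capacity[e] for e in range(m)):
                best = max(best, sum(p for *_, p in sel))
    return best
```

On a two-edge path with capacities (2, 1) and tasks spanning various subpaths, the solver picks the most profitable feasible combination; the interesting algorithmic question is of course how to approximate this without enumerating all subsets.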

This is joint work with Max Klimm and Britta Peis.

The talk will discuss the sensitivity of optimal solutions of convex separable optimization problems over an integral polymatroid base polytope with respect to parameters determining both the cost of each element and the polytope. Under convexity and a regularity assumption on the functional dependency of the cost function with respect to the parameters, it is shown that reoptimization after a change in parameters can be done by elementary local operations. I will show that these sensitivity results can be applied to a new class of non-cooperative games played on integral polymatroid base polytopes in order to compute pure Nash equilibria.

The Subset Feedback Vertex Set problem generalizes the classical Feedback Vertex Set problem and asks, for a given undirected graph G=(V,E), a subset S of V, and an integer k, whether there exists a set X of at most k vertices such that no cycle in G-X contains a vertex of S. It was independently shown by Cygan et al. (ICALP '11, SIDMA '13) and Kawarabayashi and Kobayashi (JCTB '12) that Subset Feedback Vertex Set is fixed-parameter tractable for parameter k. Cygan et al. asked whether the problem also admits a polynomial kernelization.

We answer the question of Cygan et al. positively by giving a randomized polynomial kernelization. In a first step we show that Subset Feedback Vertex Set has a randomized polynomial kernel parameterized by |S|+k with O(|S|² * k) vertices. For this we use the matroid-based tools of Kratsch and Wahlström (FOCS '12) that for example were used to obtain a polynomial kernel for s-Multiway Cut. Next we present a preprocessing that reduces the given instance (G,S,k) to an equivalent instance (G',S',k') where the size of S' is bounded by O(k^4). These two results lead to a polynomial kernel for Subset Feedback Vertex Set with O(k^9) vertices.
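For small instances, the Subset Feedback Vertex Set definition can be checked exhaustively (exponential time, illustration only; function names are assumptions made here). A vertex s lies on a cycle of a simple graph exactly when two of its neighbours are connected after deleting s:

```python
from itertools import combinations

def _components(adj, removed):
    """Connected-component id for each vertex of G - removed (DFS)."""
    comp, seen = {}, set()
    for s in adj:
        if s in seen or s in removed:
            continue
        stack = [s]
        seen.add(s)
        while stack:
            u = stack.pop()
            comp[u] = s
            for v in adj[u]:
                if v not in seen and v not in removed:
                    seen.add(v)
                    stack.append(v)
    return comp

def no_s_cycle(adj, S, X):
    # True iff no cycle of G - X passes through a vertex of S
    for s in S:
        if s in X:
            continue
        comp = _components(adj, X | {s})
        nbrs = [comp[v] for v in adj[s] if v not in X and v != s]
        if len(nbrs) != len(set(nbrs)):   # two neighbours still connected
            return False
    return True

def subset_fvs_brute_force(adj, S, k):
    """Exhaustively decide whether (G, S, k) is a yes-instance of
    Subset Feedback Vertex Set: does some X, |X| <= k, hit all S-cycles?"""
    V = list(adj)
    return any(no_s_cycle(adj, S, set(X))
               for r in range(k + 1)
               for X in combinations(V, r))
```

Note that with S = V this is exactly Feedback Vertex Set; cycles avoiding S entirely may remain untouched.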

(Joint work with Stefan Weltge)

Linear mixed-integer models are fundamental in treating combinatorial problems via Mathematical Programming. In this lecture we are going to discuss the question of how small such formulations can be made for different problems. It turns out that for several problems, including, e.g., the traveling salesman problem and the spanning tree problem, the use of additional variables is essential for the design of polynomial-sized integer programming formulations. In fact, we prove that their standard exponential-size formulations are asymptotically minimal among the formulations based on incidence vectors only. We also treat bounds for general sets of 0/1-points and briefly discuss the role of rationality of coefficients in formulations.

There is only a handful of tools for establishing lower bounds on the time complexity of polynomially solvable problems: Ben-Or's Algebraic-Computation-Tree model of computation allows Omega(n log n) lower bounds for problems like sorting or element uniqueness. The concept of 3-SUM-hardness of Gajentaan & Overmars yields (conditional) quadratic lower bounds for a variety of problems in computational geometry. The All-Pairs-Shortest-Paths conjecture of Williams & Williams yields (conditional) cubic lower bounds for many natural problems in graph algorithms. The talk surveys these approaches and provides a number of illustrative examples.