
Research Colloquium (Summer Term 2022)

The research colloquium "Algorithmik und Komplexitätstheorie" features talks by external guests, members of the research staff, Ph.D. students, and advanced students (presenting their theses) on recent results and research topics in theoretical computer science and related areas. The core areas are algorithmics and computational complexity theory. If you would like to receive information about upcoming talks, please join our mailing list. If you would like to give a talk yourself, feel free to send an email to tomohiro.koana@tu-berlin.de.

Talks usually take place either in person at Ernst-Reuter-Platz 7, Room 512, or online at https://tu-berlin.zoom.us/j/68424360050?pwd=a2F3TlV4bzZPckxwOStOWkR4NzhuQT09.

The schedule will be updated during the term.
Till Fluschnik
(TU Berlin)
Placing Green Bridges Optimally, with Habitats Inducing Cycles

Choosing the placement of wildlife crossings (i.e., green bridges) to reconnect animal species’ fragmented habitats is among the 17 goals towards sustainable development by the UN. We consider the following established model: Given a graph whose vertices represent the fragmented habitat areas and whose weighted edges represent possible green bridge locations, as well as the habitable vertex set for each species, find the cheapest set of edges such that each species’ habitat is connected. We study this problem from a theoretical (algorithms and complexity) and an experimental perspective, while focusing on the case where habitats induce cycles. We prove that the NP-hardness persists in this case even if the graph structure is restricted. If the habitats additionally induce faces in plane graphs, however, the problem becomes efficiently solvable. In our empirical evaluation we compare this algorithm, ILP formulations for more general variants, and an approximation algorithm with one another. Our evaluation underlines that each specialization is beneficial in terms of running time, whereas the approximation provides highly competitive solutions in practice.
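The connectivity requirement in the model above can be sketched as a small check: given a chosen set of green-bridge edges, does each species' habitat end up in a single connected component? The data layout and function name below are illustrative, not from the paper.

```python
# Hedged sketch (illustrative, not the authors' code): check whether a
# species' habitat vertices are connected using only the chosen edges
# that run inside the habitat.

def habitat_connected(chosen_edges, habitat):
    habitat = set(habitat)
    # union-find over the habitat vertices
    parent = {v: v for v in habitat}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for u, v in chosen_edges:
        if u in habitat and v in habitat:
            parent[find(u)] = find(v)
    return len({find(v) for v in habitat}) <= 1

bridges = [(1, 2), (2, 3), (3, 4)]
print(habitat_connected(bridges, {1, 2, 3}))  # True
print(habitat_connected(bridges, {1, 4}))     # False: no chosen edge inside {1, 4}
```

A solution to the model must pass this check for every species simultaneously, at minimum total edge weight.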

Joint work with Maike Herkenrath, Francesco Grothe, and Leon Kellerhals

Niclas Boehmer
(TU Berlin)
Combating Collusion Rings is Hard but Possible

A recent report by Littman [Commun. ACM '21] outlines the existence and the fatal impact of collusion rings in academic peer reviewing. We introduce and analyze the problem Cycle-Free Reviewing, which aims at finding a review assignment without the following kind of collusion ring: a sequence of reviewers, each reviewing a paper authored by the next reviewer in the sequence (with the last reviewer reviewing a paper of the first), thus creating a review cycle in which each reviewer gives favorable reviews. As a result, all papers in that cycle have a high chance of acceptance independent of their respective scientific merit. We observe that review assignments computed using a standard Linear Programming approach typically admit many short review cycles. On the negative side, we show that Cycle-Free Reviewing is NP-hard in various restricted cases (e.g., when every author is qualified to review all papers and one wants to prevent authors from reviewing each other's or their own papers, or when every author has only one paper and is qualified to review only few papers). On the positive side, among others, we show that, in some realistic settings, an assignment without any review cycles of small length always exists. This result also gives rise to an efficient heuristic for computing (weighted) cycle-free review assignments, which we show to be of excellent quality in practice.
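The shortest possible review cycles have length two: two authors reviewing each other's papers. As a toy illustration of what Cycle-Free Reviewing forbids, here is a hedged sketch that detects such 2-cycles in a given assignment; the data model is made up for the example.

```python
# Hedged sketch (illustrative data model, not the authors' implementation):
# reviews[r] is the set of authors whose papers reviewer r reviews.

def two_cycles(reviews):
    """Return pairs (a, b) where a reviews b's paper and b reviews a's paper."""
    cycles = set()
    for a, targets in reviews.items():
        for b in targets:
            if a != b and a in reviews.get(b, set()):
                cycles.add(tuple(sorted((a, b))))
    return sorted(cycles)

assignment = {
    "alice": {"bob"},
    "bob": {"alice", "carol"},
    "carol": {"dana"},
    "dana": {"carol"},
}
print(two_cycles(assignment))  # [('alice', 'bob'), ('carol', 'dana')]
```

Longer cycles can be found analogously with standard directed-cycle detection on the "reviews a paper of" graph.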

This is joint work with Robert Bredereck and André Nichterlein and accepted to AAAI 2022.

Tomohiro Koana
(TU Berlin)
The Complexity of Finding Fair Many-to-One Matchings

We analyze the (parameterized) computational complexity of "fair" variants of bipartite many-to-one matching, where each vertex from the "left" side is matched to exactly one vertex and each vertex from the "right" side may be matched to multiple vertices. We want to find a "fair" matching, in which each vertex from the right side is matched to a "fair" set of vertices. Assuming that each vertex from the left side has one color modeling its attribute, we study two fairness criteria. In one of them, we deem a vertex set fair if for any two colors, the difference between the numbers of their occurrences does not exceed a given threshold. Fairness is relevant when finding many-to-one matchings between students and colleges, voters and constituencies, and applicants and firms. Here colors may model sociodemographic attributes, party memberships, and qualifications, respectively. We show that finding a fair many-to-one matching is NP-hard even for three colors and maximum degree five. Our main contribution is the design of fixed-parameter tractable algorithms with respect to the number of vertices on the right side. Our algorithms make use of a variety of techniques including color coding. At the core lie integer linear programs encoding Hall-like conditions. To establish the correctness of our integer programs, we prove a new separation result, inspired by Frank's separation theorem [Frank, Discrete Math. 1982], which may also be of independent interest. We further obtain complete complexity dichotomies regarding the number of colors and the maximum degree of each side.
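The first fairness criterion above is easy to state in code. A minimal sketch, assuming we compare only colors that actually occur in the matched set (how absent colors are treated is a modeling choice not fixed by the abstract):

```python
# Hedged sketch of the fairness check: the colors of all left vertices matched
# to one right vertex are fair if the counts of any two occurring colors
# differ by at most the threshold. Names are illustrative.

from collections import Counter

def is_fair(colors, threshold):
    """colors: color labels of the vertices matched to one right-side vertex."""
    counts = Counter(colors)
    return max(counts.values()) - min(counts.values()) <= threshold

print(is_fair(["red", "red", "blue", "green"], 1))  # True
print(is_fair(["red", "red", "red", "blue"], 1))    # False
```

A fair matching must satisfy this check at every right-side vertex, which is what makes the problem globally hard.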

This is joint work with Niclas Boehmer and accepted to ICALP 2022.

Nimrod Talmon
(Ben-Gurion University of the Negev)
Multi-Party Campaigning

We study a social choice setting of manipulation in elections and extend the usual model in two major ways: first, instead of considering a single manipulating agent, in our setting there are several, possibly competing ones; second, instead of evaluating an election after the first manipulative action, we allow several back-and-forth rounds to take place. We show that in certain situations, such as in elections with only a few candidates, optimal strategies for each of the manipulating agents can be computed efficiently. Our algorithmic results rely on formulating the problem of finding an optimal strategy as sentences of Presburger arithmetic that are short and only involve small coefficients, which we show is fixed-parameter tractable – indeed, one of our contributions is a general result regarding fixed-parameter tractability of Presburger arithmetic that might be useful in other settings. Following our general theorem, we design quite general algorithms; in particular, we describe how to design efficient algorithms for various settings, including settings in which we model diffusion of opinions in a social network, complex budgeting schemes available to the manipulating agents, and various realistic restrictions on adversary actions.

Joint work with Martin Koutecky.

Pascal Kunz
(TU Berlin)
Disentangling the Computational Complexity of Network Untangling

We study the network untangling problem introduced by Rozenshtein, Tatti, and Gionis [DMKD 2021], which is a variant of Vertex Cover on temporal graphs—graphs whose edge set changes over discrete time steps. They introduce two problem variants. The goal is to select at most k time intervals for each vertex such that all time-edges are covered and (depending on the problem variant) either the maximum interval length or the total sum of interval lengths is minimized. This problem has data mining applications in finding activity timelines that explain the interactions of entities in complex networks.

Both variants of the problem are NP-hard. In this paper, we initiate a multivariate complexity analysis involving the following parameters: number of vertices, lifetime of the temporal graph, number of intervals per vertex, and the interval length bound. For both problem versions, we (almost) completely settle the parameterized complexity for all combinations of those four parameters, thereby delineating the border of fixed-parameter tractability.
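The covering condition described above can be illustrated with a small checker: every time-edge (u, v, t) must lie inside a selected interval of u or of v. The data layout is illustrative, not from the paper:

```python
# Hedged sketch of the coverage check in network untangling: each vertex gets
# a set of time intervals; a time-edge (u, v, t) is covered if u or v has a
# selected interval containing time step t.

def covers(intervals, temporal_edges):
    """intervals[v]: list of (start, end) intervals selected for vertex v."""
    def active(v, t):
        return any(s <= t <= e for s, e in intervals.get(v, []))
    return all(active(u, t) or active(v, t) for u, v, t in temporal_edges)

edges = [("a", "b", 1), ("b", "c", 3), ("a", "c", 5)]
choice = {"a": [(1, 1), (5, 5)], "b": [(3, 3)]}
print(covers(choice, edges))  # True
```

Here vertex "a" uses two intervals, so this selection requires k ≥ 2; the two objectives then score either the longest selected interval or the total length of all selected intervals.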

Joint work with Vincent Froese and Philipp Zschoche. Will appear in the proceedings of IJCAI 22.

Mathias Weller
(Université Gustave Eiffel)
Treewidth-based Algorithms for the Small Parsimony Problem on Phylogenetic Networks

As opposed to phylogenetic (that is, evolutionary) trees, phylogenetic networks (that is, special rooted DAGs) often do not lend themselves to polynomial-time analysis (unless P=NP). Therefore, my colleagues and I started considering parameterizations by width-measures. Being directed 'down' from the root, such width-measures can take advantage of the 'natural direction' of the flow of information in phylogenetic networks: often, the further we move away from the leaves, the less information we have and, thus, the more information must be inferred. However, when we tried to ignore this direction in an effort of pushing our measure to smaller values, we were surprised that it suddenly became equal to the treewidth of the underlying graph. This talk is an illustrated recap of this incident, including an example of how to do dynamic programming on phylogenetic networks using our treewidth formulation.

Leon Kellerhals
(TU Berlin)
Optimal Virtual Network Embeddings for Tree Topologies

The performance of distributed and data-centric applications often critically depends on the interconnecting network. Applications are hence modeled as virtual networks, also accounting for resource demands on links. At the heart of provisioning such virtual networks lies the NP-hard Virtual Network Embedding Problem (VNEP): how to jointly map the virtual nodes and links onto a physical substrate network at minimum cost while obeying capacities.

This paper studies the VNEP in the light of parameterized complexity. We focus on tree topology substrates, a case often encountered in practice and for which the VNEP remains NP-hard. We provide the first fixed-parameter algorithm for the VNEP with running time O(3^r · (s + r^2)) for requests and substrates of r and s nodes, respectively. In a computational study our algorithm yields running time improvements in excess of 200x compared to state-of-the-art integer programming approaches. This makes it comparable in speed to the well-established ViNE heuristic while providing optimal solutions. We complement our algorithmic study with hardness results for the VNEP and related problems.

Joint work with Aleksander Figiel, Rolf Niedermeier, Matthias Rost, Stefan Schmid, and Philipp Zschoche. This has been presented at SPAA '21.

Frank Sommer
(Philipps-Universität Marburg)
Covering Many (or Few) Edges with k Vertices in Sparse Graphs

We study the following fixed-cardinality optimization problem in a maximization and a minimization variant. For a fixed α between zero and one, we are given a graph and two numbers, an integer k and a rational t. The task is to find a vertex subset S of exactly k vertices that has value at least t in the maximization variant or at most t in the minimization variant. Here, the value of a vertex set is computed as α times the number of edges with exactly one endpoint in S plus (1 − α) times the number of edges with both endpoints in S. These two problems generalize many prominent graph problems, such as Densest k-Subgraph, Sparsest k-Subgraph, Partial Vertex Cover, and Max (k, n−k)-Cut.

In this work, we complete the picture of their parameterized complexity on several types of sparse graphs that are described by structural parameters. In particular, we provide kernelization algorithms and kernel lower bounds for these problems. A somewhat surprising consequence of our kernelizations is that Partial Vertex Cover and Max (k, n−k)-Cut not only behave in the same way but that the kernels for both problems can be obtained by the same algorithms.
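The objective from the first paragraph can be sketched directly. A minimal illustration (the function name and data are hypothetical): the value of a set S is α times the boundary edges plus (1 − α) times the internal edges.

```python
# Hedged sketch of the fixed-cardinality objective: alpha weights the edges
# with exactly one endpoint in S, (1 - alpha) weights the edges with both
# endpoints in S.

def value(edges, S, alpha):
    S = set(S)
    boundary = sum(1 for u, v in edges if (u in S) != (v in S))
    inside = sum(1 for u, v in edges if u in S and v in S)
    return alpha * boundary + (1 - alpha) * inside

# Triangle on {0, 1, 2} plus a pendant edge (2, 3); S = {0, 1}.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
print(value(edges, {0, 1}, 0.5))  # 0.5 * 2 + 0.5 * 1 = 1.5
```

For example, with α = 0 only edges inside S count, which (maximized) is the Densest k-Subgraph objective, and with α = 1 only cut edges count.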

Joint work with Tomohiro Koana, Christian Komusiewicz, and André Nichterlein. This has been presented at STACS 2022.

Pascal Kunz
(TU Berlin)
Most Classic Problems Remain NP-hard on Relative Neighborhood Graphs and their Relatives

Proximity graphs have been studied for several decades, motivated by applications in computational geometry, geography, data mining, and many other fields. However, the computational complexity of classic graph problems on proximity graphs mostly remained open. We now study 3-Colorability, Dominating Set, Feedback Vertex Set, Hamiltonian Cycle, and Independent Set on the proximity graph classes relative neighborhood graphs, Gabriel graphs, and relatively closest graphs. We prove that all of the problems remain NP-hard on these graphs, except for 3-Colorability and Hamiltonian Cycle on relatively closest graphs, where the former is trivial and the latter is left open. Moreover, for every NP-hard case we additionally show that no 2^o(n^(1/4))-time algorithm exists unless the ETH fails, where n denotes the number of vertices.
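For intuition, here is a hedged sketch of the standard definition of a relative neighborhood graph (assumed, not quoted from the talk): two points p and q are adjacent iff no third point r is strictly closer to both of them than they are to each other. A brute-force construction:

```python
# Hedged sketch: build the relative neighborhood graph of a point set by
# checking, for each pair (p, q), that no witness r satisfies
# max(d(p, r), d(q, r)) < d(p, q).

from math import dist
from itertools import combinations

def relative_neighborhood_graph(points):
    edges = []
    for p, q in combinations(points, 2):
        if not any(max(dist(p, r), dist(q, r)) < dist(p, q)
                   for r in points if r != p and r != q):
            edges.append((p, q))
    return edges

pts = [(0, 0), (1, 0), (2, 0)]
print(relative_neighborhood_graph(pts))
# [((0, 0), (1, 0)), ((1, 0), (2, 0))] -- the middle point blocks the long edge
```

Gabriel graphs and relatively closest graphs are defined by analogous, slightly different "forbidden witness" conditions.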

Joint work with Till Fluschnik, Rolf Niedermeier, and Malte Renken. Will be presented at SWAT '22.

Matthias Bentert
(TU Berlin)
Stable Matching with Multilayer Approval Preferences: Approvals can be Harder than Strict Preferences

We study stable matching problems where agents have multilayer preferences: There are l layers, each consisting of one preference relation for each agent. Recently, Chen et al. [EC ’18] studied such problems with strict preferences, establishing four multilayer adaptations of classical notions of stability. We follow up on their work by analyzing the computational complexity of stable matching problems with multilayer approval preferences. We consider eleven stability notions derived from three well-established stability notions for stable matchings with ties and the four adaptations proposed by Chen et al. For each stability notion, we show that the problem of finding a stable matching is either polynomial-time solvable or NP-hard. Furthermore, we examine the influence of the number of layers and the desired “degree of stability” on the problems’ complexity. Somewhat surprisingly, we discover that assuming approval preferences instead of strict preferences does not considerably simplify the situation (and sometimes even makes polynomial-time solvable problems NP-hard).

Joint work with Niclas Boehmer, Klaus Heeger, and Tomohiro Koana.

Hendrik Molter
(Ben-Gurion University of the Negev)
Equitable Scheduling for the Total Completion Time Objective

We investigate a novel scheduling problem where we have n clients, each associated with a single job on each of a set of m different days. On each day, a single machine is available to process the n jobs non-preemptively. The goal is to provide an equitable set of schedules for all m days such that the sum of completion times of each client over all days is not greater than some specified equability parameter k. The 1 || max_j Σ_i C_{i,j} problem, as we refer to it in this paper, fits nicely into a new model introduced by Heeger et al. [AAAI '21] that aims at capturing a generic notion of fairness in scheduling settings where the same set of clients repeatedly submit scheduling requests over a fixed period of time. We show that the 1 || max_j Σ_i C_{i,j} problem is NP-hard even under quite severe restrictions. This leads us to investigate two natural special cases: one where we assume the number of days to be small, and one where we consider the number of clients to be small. We present several tractability results for both cases.
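The objective max_j Σ_i C_{i,j} can be illustrated with a toy computation: fix a processing order per day, accumulate each client's completion times across days, and take the maximum over clients. The data layout is made up for the example:

```python
# Hedged sketch of the equitability objective: for each client, sum its
# completion times over all days; the schedule set is equitable if the
# largest such sum is at most the bound k.

def max_total_completion(orders, proc_time):
    """orders: one list per day giving the processing order of client ids;
    proc_time[day][client]: that job's processing time on that day."""
    total = {}
    for day, order in enumerate(orders):
        t = 0
        for client in order:
            t += proc_time[day][client]  # completion time C_{client, day}
            total[client] = total.get(client, 0) + t
    return max(total.values())

orders = [["a", "b"], ["b", "a"]]               # swap the order on day 2
proc = [{"a": 1, "b": 1}, {"a": 1, "b": 1}]
print(max_total_completion(orders, proc))       # 3: both clients total 1 + 2
```

Alternating who goes first, as above, equalizes the clients' totals; always scheduling "a" first would give "b" a total of 4 instead.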

Joint work with Danny Hermelin, Rolf Niedermeier, and Dvir Shabtay.

Falk Hüffner
(TomTom)
Routing algorithms at TomTom

In this talk, I will present some routing problems and algorithms that model real-world problems at TomTom, a company that produces software, services, and maps for navigation and traffic information.

The first setting is on-street parking: given information on each road segment of the probability of finding parking there, the goal is to find a route that minimizes the expected time until parking is found. This problem is NP-hard; we solve it using a simple BFS-based heuristic. A similar problem comes up when trying to minimize the expected time until finding a free charging station for an electric vehicle.
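As a toy illustration of this kind of objective (not TomTom's actual algorithm), assume each segment of a fixed route takes time t_i to drive and offers parking at its end with independent probability p_i; then the expected driving time until parked is Σ_i t_i · Π_{j<i} (1 − p_j):

```python
# Hedged toy model: you drive segment i only if parking was not found on an
# earlier segment, so each segment's drive time is weighted by the probability
# of still searching. The case where the whole route fails is ignored here.

def expected_search_time(segments):
    """segments: list of (drive_time, parking_probability) pairs along a route."""
    expected, p_still_searching = 0.0, 1.0
    for drive_time, p_park in segments:
        expected += drive_time * p_still_searching
        p_still_searching *= 1 - p_park
    return expected

route = [(60, 0.5), (60, 0.5), (60, 0.5)]
print(expected_search_time(route))  # 60 + 30 + 15 = 105.0
```

Choosing among routes to minimize this quantity is what makes the problem hard: later segments' contributions depend on all earlier parking probabilities.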

The second is about "strategic routing": Routes are calculated for more than one agent, and quality is evaluated by shared scoring. Most formulations are NP-hard, and we solve certain instances by reducing to Multi-Criteria Shortest Path.

Finally, I will talk about the implementation of the TomTom electric vehicle routing service, which calculates a route with automatically added charge stops that are required when a route is too long to be driven on the initial charge. This problem is also NP-hard, and we solve it using a heuristic based on a preprocessing of the map.

Klaus Heeger
(TU Berlin)
Excluding Linear-Time Polynomial-Sized Kernels for Polynomial-Time Solvable Problems

In recent years, the study of parameterized algorithms for polynomial-time solvable problems ("FPT in P") has intensified. We continue this line of research by considering kernelization for polynomial-time solvable problems. Based on the composition framework of Bodlaender, Downey, Fellows, and Hermelin [JCSS '09], we design a framework for excluding linear-time computable, polynomial-sized kernels under popular conjectures such as the Strong Exponential Time Hypothesis, the All-Pairs Shortest Paths conjecture, or the 3-SUM conjecture. We apply our framework to several problems from different areas, including problems on strings, problems on graphs, and problems from computational geometry.

This is ongoing work started with André Nichterlein and Rolf Niedermeier.
