TU Berlin

Research Colloquium (Summer Term 2020)

The research colloquium "Algorithmik und Komplexitätstheorie" features talks by external guests, members of the research staff, Ph.D. students, and advanced students (presenting their theses) on recent results and research topics in theoretical computer science and related areas. The core areas are algorithmics and computational complexity theory. If you would like to receive information about upcoming talks, please join our mailing list. If you would like to give a talk yourself, feel free to send an email to m.renken@tu-berlin.de.

Due to the Coronavirus situation, this semester’s talks are given online, usually at meet.akt.tu-berlin.de/b/mal-jkm-myf.

The schedule will be updated during the term.
Niclas Boehmer
(TU Berlin)
Fine-Grained View on Bribery for Group Identification

Given a set of individuals qualifying or disqualifying each other, group identification is the task of identifying a socially qualified subgroup of individuals. Social qualification depends on the specific rule used to aggregate individual qualifications. The bribery problem in this context asks how many agents need to change their qualifications in order to change the outcome.

Complementing previous results showing polynomial-time solvability or NP-hardness of bribery for various social rules in the constructive (aiming at making specific individuals socially qualified) or destructive (aiming at making specific individuals socially disqualified) setting, we provide a comprehensive picture of the parameterized computational complexity landscape. Conceptually, we also consider a more fine-grained concept of bribery cost, where we ask how many single qualifications need to be changed, as well as a more general bribery goal that combines the constructive and destructive setting.
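As a small, concrete illustration of the model (not taken from the talk): assume the simple majority rule, under which an individual is socially qualified iff a strict majority of agents qualifies them, and count how many agents must be bribed to make a target individual socially qualified. The rule and the brute-force approach are illustrative choices only.

```python
from itertools import combinations

def socially_qualified(profile):
    """Illustrative majority rule: agent j is socially qualified iff a
    strict majority of agents qualifies j. profile[i][j] = 1 means agent
    i qualifies agent j."""
    n = len(profile)
    return {j for j in range(n) if sum(profile[i][j] for i in range(n)) > n / 2}

def min_bribed_agents(profile, target):
    """Constructive bribery by brute force: smallest number of agents
    whose qualification vectors may be rewritten so that `target`
    becomes socially qualified. Under the majority rule, having each
    bribed agent qualify everyone is optimal for the target's column."""
    n = len(profile)
    for k in range(n + 1):
        for bribed in combinations(range(n), k):
            new = [row[:] for row in profile]
            for i in bribed:
                new[i] = [1] * n
            if target in socially_qualified(new):
                return k
    return None
```

Here three agents all qualify only agent 1; making agent 0 socially qualified requires rewriting two agents' opinions.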

Leon Kellerhals
(TU Berlin)
Parameterized Complexity of Geodetic Set

A vertex set S of a graph G is *geodetic* if every vertex of G lies on a shortest path between two vertices in S. Given a graph G and a natural number k, GEODETIC SET asks whether there is a geodetic set of size at most k. We study the parameterized complexity of GEODETIC SET with respect to structural parameters and show dichotomy results: We present fixed-parameter algorithms with respect to the feedback edge number and with respect to the tree-depth. On the negative side, we show that the problem is W[1]-hard when parameterized by the feedback vertex number and the path-width.
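The geodetic property itself is easy to verify: v lies on a shortest u-w path iff d(u,v) + d(v,w) = d(u,w). A minimal checker (assuming a connected graph given as an adjacency dict; this is only a verification sketch, not one of the talk's algorithms):

```python
from collections import deque
from itertools import combinations

def bfs_dist(adj, s):
    """Single-source BFS distances in an unweighted graph."""
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def is_geodetic(adj, S):
    """Check whether S is geodetic in a connected graph: every vertex
    must lie on some shortest path between two vertices of S."""
    dist = {u: bfs_dist(adj, u) for u in adj}
    covered = set(S)
    for u, w in combinations(S, 2):
        for v in adj:
            if dist[u][v] + dist[v][w] == dist[u][w]:
                covered.add(v)
    return covered == set(adj)
```

On the path 0-1-2-3, the endpoint pair {0, 3} is geodetic, while {0, 1} is not.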

Thorsten Koch
(Zuse Institute Berlin)
Solving Mixed Integer Programs

In the talk, we will try to sketch the answers to the following questions:

- What is a MILP and what can you do with it?
- How can one find a globally optimal solution to an NP-hard problem?
- How deep can we cut?
- Why is solving LPs so important for MILP?
- What does the typical ("Max Muster") MILP look like?
- How can this be extended to solving MINLP?
- Want to see for yourself?

Klaus Heeger
(TU Berlin)
Multistage Graph Problems on a Global Budget

Time-evolving or temporal graphs are gaining more and more popularity in the study of the behavior of complex networks. In this context, the multistage view on computational problems is among the most natural frameworks. Roughly speaking, herein one studies the different (time) layers of a temporal graph (effectively meaning that the edge set may change over time, but the vertex set remains unchanged), and one searches for a solution of a given graph problem for each layer. The twist in the multistage setting is that the solutions found must not differ too much between subsequent layers. We relax this notion by introducing a global budget view instead of the local one studied so far. More specifically, we allow for a few disruptive changes between subsequent layers but request that overall, that is, summed over all layers, the degree of change is upper-bounded. Studying several classical graph problems (both NP-hard and polynomial-time solvable ones) from a parameterized angle, we encounter both fixed-parameter tractability and parameterized hardness results. Somewhat surprisingly, we find that sometimes the global multistage versions of NP-hard problems such as Vertex Cover turn out to be computationally more tractable than the ones of polynomial-time solvable problems such as Matching.
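The global-budget condition is simple to state in code. A sketch for the Vertex Cover case (an illustrative feasibility check, measuring change as the symmetric difference between consecutive solutions; not an algorithm from the talk):

```python
def is_vertex_cover(edges, S):
    """S covers the layer iff every edge has an endpoint in S."""
    return all(u in S or v in S for u, v in edges)

def valid_global_multistage(layers, solutions, global_budget):
    """layers: one edge set per time step (shared vertex set);
    solutions: one vertex set per layer. Accept iff every solution
    covers its layer and the changes between consecutive solutions,
    summed over all transitions, stay within the global budget."""
    if not all(is_vertex_cover(E, S) for E, S in zip(layers, solutions)):
        return False
    total_change = sum(len(solutions[t] ^ solutions[t + 1])
                       for t in range(len(solutions) - 1))
    return total_change <= global_budget
```

A local-budget variant would instead bound each summand individually; the global view only constrains the total.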

Tomohiro Koana
(TU Berlin)
Exploiting c-Closure in Kernelization Algorithms for Graph Problems

A graph is c-closed if every pair of vertices with at least c common neighbors is adjacent. The c-closure of a graph G is the smallest number such that G is c-closed. Fox et al. [ICALP '18] defined c-closure and investigated it in the context of clique enumeration. We show that c-closure can be applied in kernelization algorithms for several classic graph problems. We show that Dominating Set admits a kernel of size k^O(c), that Induced Matching admits a kernel with O(c^7 * k^8) vertices, and that Irredundant Set admits a kernel with O(c^(5/2) * k^3) vertices. Our kernelization exploits the fact that c-closed graphs have polynomially-bounded Ramsey numbers, as we show.
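By the definition above, the c-closure equals one plus the maximum number of common neighbors over all non-adjacent vertex pairs; a direct (quadratic-pair) computation for small graphs might look as follows (a sketch for intuition, not a kernelization step from the talk):

```python
from itertools import combinations

def c_closure(adj):
    """Smallest c such that every pair of vertices with at least c
    common neighbors is adjacent: 1 + the maximum number of common
    neighbors over non-adjacent pairs (1 if all pairs are adjacent)."""
    worst = 0
    for u, v in combinations(adj, 2):
        if v not in adj[u]:
            worst = max(worst, len(set(adj[u]) & set(adj[v])))
    return worst + 1
```

For the 4-cycle, the two non-adjacent pairs each have two common neighbors, so the c-closure is 3; a triangle has c-closure 1.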

Hendrik Molter
(TU Berlin)
Algorithmic Aspects of Temporal Betweenness

The betweenness centrality of a graph vertex measures how often this vertex is visited on shortest paths between other vertices of the graph. In the analysis of many real-world graphs or networks, betweenness centrality of a vertex is used as an indicator for its relative importance in the network. In particular, it is among the most popular tools in social network analysis. In recent years, a growing number of real-world networks are modeled as temporal graphs instead of conventional (static) graphs. In a temporal graph, we have a fixed set of vertices and there is a finite discrete set of time steps and every edge might be present only at some time steps. While shortest paths are straightforward to define in static graphs, temporal paths can be considered “optimal” with respect to many different criteria, including length, arrival time, and overall travel time (shortest, foremost, and fastest paths). This leads to different concepts of temporal betweenness centrality, posing new challenges on the algorithmic side. We provide a systematic study of temporal betweenness variants based on various concepts of optimal temporal paths. Computing the betweenness centrality for vertices in a graph is closely related to counting the number of shortest paths between vertex pairs. We show that counting foremost and fastest paths is computationally intractable (#P-hard) and hence the computation of the corresponding temporal betweenness values is intractable as well. For shortest paths and two selected special cases of foremost paths, we devise polynomial-time algorithms for temporal betweenness computation. Moreover, we also explore the distinction between strict (strictly ascending time labels) and non-strict (non-descending time labels) temporal paths. In our experiments with established real-world temporal networks, we demonstrate the practical effectiveness of our algorithms, compare the various betweenness concepts, and derive recommendations on their practical use.

Joint work with Sebastian Buß, Rolf Niedermeier, and Maciej Rymar.
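The strict/non-strict distinction mentioned in the abstract can be pinned down with a few lines of code (an illustrative validity check on a sequence of time-labeled hops; the talk's algorithms are of course more involved):

```python
def is_temporal_path(edges, strict=True):
    """edges: list of (u, v, t) hops. A temporal path is a walk whose
    consecutive hops share an endpoint and whose time labels are
    strictly ascending (strict) or non-descending (non-strict)."""
    for (u1, v1, t1), (u2, v2, t2) in zip(edges, edges[1:]):
        if v1 != u2:
            return False
        if strict and not t1 < t2:
            return False
        if not strict and not t1 <= t2:
            return False
    return True
```

A walk reusing the same time step on consecutive hops is a valid non-strict temporal path but not a strict one.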

Till Fluschnik
(TU Berlin)
Multistage Committee Election

Electing a single committee of a small size is a classic and well-understood voting situation. Being interested in a sequence of committees, we introduce and study two time-dependent multistage models based on simple Plurality voting. Therein, we are given a sequence of voting profiles (stages) over the same set of agents and candidates, and our task is to find, for each stage, a small committee with a high score. In the conservative model we additionally require that any two consecutive committees have a small symmetric difference. Analogously, in the revolutionary model we require large symmetric differences. We study the classic computational and parameterized complexity of both problems.

Joint work with Robert Bredereck and Andrzej Kaczmarczyk.
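For tiny instances the conservative model can be solved by exhaustive search (a brute-force sketch for intuition only; the Plurality scoring of a committee as "one point per voter whose top choice is in it" is an illustrative reading):

```python
from itertools import combinations

def plurality_score(profile, committee):
    """profile: one ranking per voter (most preferred first); the
    committee scores a point per voter whose top choice it contains."""
    return sum(1 for ranking in profile if ranking[0] in committee)

def best_conservative_committees(stages, candidates, k, diff_bound):
    """Brute force: pick one size-k committee per stage maximizing the
    total score, with any two consecutive committees differing in at
    most diff_bound candidates (conservative model)."""
    all_committees = list(combinations(candidates, k))
    best, best_score = None, -1

    def rec(t, chosen, score):
        nonlocal best, best_score
        if t == len(stages):
            if score > best_score:
                best, best_score = list(chosen), score
            return
        for comm in all_committees:
            if chosen and len(set(chosen[-1]) ^ set(comm)) > diff_bound:
                continue
            chosen.append(comm)
            rec(t + 1, chosen, score + plurality_score(stages[t], comm))
            chosen.pop()

    rec(0, [], 0)
    return best, best_score
```

With two stages whose Plurality winners differ, a zero change budget forces a compromise, while a larger budget lets the committee follow each stage's winner.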

Junjie Luo
(TU Berlin)
On Improving Resource Allocations by Sharing

Given an initial resource allocation, where some agents may envy others or a different distribution of resources might lead to higher social welfare, our goal is to improve the allocation without reassigning resources. We consider a sharing concept that allows resources to be shared with social neighbors of the resource owners. To this end, we introduce a formal model that allows a central authority to compute an optimal sharing between neighbors based on an initial allocation. Pioneering this point of view, we focus on the most basic scenario where each agent may share at most one resource with one of its neighbors. We present algorithms for optimizing utilitarian and egalitarian social welfare of allocations and reducing the number of envious agents. In particular, we examine the computational complexity with respect to several natural parameters. Furthermore, we study cases with simple social network structures and, among others, devise polynomial-time algorithms in path- and tree-like social networks.

This is joint work with Robert Bredereck, Andrzej Kaczmarczyk, Rolf Niedermeier, and Florian Sachse.
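A toy version of the basic scenario can be brute-forced (the welfare model below, where a shared resource simply adds the receiver's utility on top of the owner's, is an assumption made here for illustration and need not match the talk's formal model):

```python
from itertools import product

def best_sharing(adj, alloc, util):
    """Brute force over all sharing decisions: each agent shares either
    nothing or exactly one of its resources with one neighbor. Assumed
    welfare model: a shared resource additionally yields the receiver's
    utility for it. Returns the maximum utilitarian welfare."""
    agents = sorted(adj)
    base = sum(util[a][r] for a in agents for r in alloc[a])
    choices = [[None] + [(r, b) for r in alloc[a] for b in adj[a]]
               for a in agents]
    best = base
    for combo in product(*choices):
        gain = 0
        for c in combo:
            if c is not None:
                r, b = c
                gain += util[b][r]
        best = max(best, base + gain)
    return best
```

With two neighbors where agent 0 owns a resource it values at 1 and agent 1 values at 5, sharing raises the utilitarian welfare from 1 to 6.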

Malte Renken
(TU Berlin)
Connectivity Threshold in Random Temporal Graphs

In a temporal graph, edges are time-labeled and any path must follow an increasing sequence of time labels. It is well known that in the Erdős–Rényi model for random graphs, log(n)/n is a threshold function for connectedness, meaning that a random graph is connected asymptotically almost surely if and only if the edge density is above log(n)/n. We propose a similar model for random temporal graphs and investigate at which density they become connected.
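The classical (static) threshold is easy to observe experimentally: sampling G(n, p) at densities just above and just below log(n)/n shows a sharp jump in the fraction of connected graphs. A Monte Carlo sketch (illustrating only the known static threshold, not the talk's temporal model):

```python
import math
import random

def er_connected(n, p, rng):
    """Sample G(n, p) and test connectedness via union-find."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    comps = n
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                ru, rv = find(u), find(v)
                if ru != rv:
                    parent[ru] = rv
                    comps -= 1
    return comps == 1

def connectivity_rate(n, c, trials=200, seed=0):
    """Fraction of connected samples at edge density p = c * log(n) / n."""
    rng = random.Random(seed)
    p = c * math.log(n) / n
    return sum(er_connected(n, p, rng) for _ in range(trials)) / trials
```

Already at n = 100, densities at twice the threshold yield almost always connected graphs, while a third of the threshold almost never does.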

George Mertzios
(Durham University)
Binary Search in Graphs Revisited

In the classical binary search in a path, the aim is to detect an unknown target by asking as few queries as possible, where each query reveals the direction to the target. This binary search algorithm has been recently extended by [Emamjomeh-Zadeh et al., STOC, 2016] to the problem of detecting a target in an arbitrary graph. Similarly to the classical case in the path, the algorithm of Emamjomeh-Zadeh et al. maintains a candidate set for the target, while each query asks an appropriately chosen vertex (the "median") which minimises a potential Phi among the vertices of the candidate set. In this paper we address three open questions posed by Emamjomeh-Zadeh et al., namely (a) detecting a target when the query response is a direction to an approximately shortest path to the target, (b) detecting a target when querying a vertex that is an approximate median of the current candidate set (instead of an exact one), and (c) detecting multiple targets, for which, to the best of our knowledge, no progress has been made so far. We resolve questions (a) and (b) by providing appropriate upper and lower bounds, as well as a new potential Gamma that guarantees efficient target detection even when querying an approximate median each time. With respect to (c), we initiate a systematic study of detecting two targets in graphs and we identify sufficient conditions on the queries that allow for strong (linear) lower bounds and strong (polylogarithmic) upper bounds on the number of queries. All of our positive results can be derived using our new potential Gamma that allows querying approximate medians.
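The classical path case that the graph setting generalizes is exactly textbook binary search with direction queries (O(log n) queries); a minimal sketch for reference:

```python
def find_target_on_path(n, query):
    """Binary search on the path 0..n-1. query(v) returns 0 if v is the
    target, -1 if the target lies to the left of v, +1 if to the right.
    Returns (target, number of queries used), using O(log n) queries."""
    lo, hi = 0, n - 1
    queries = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        queries += 1
        d = query(mid)
        if d == 0:
            return mid, queries
        if d < 0:
            hi = mid - 1
        else:
            lo = mid + 1
    raise ValueError("no target found")
```

On general graphs, "left/right" becomes "a neighbor on a shortest path to the target", and the midpoint is replaced by the median vertex minimising the potential.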

Robert Bredereck
(TU Berlin)
Maximizing the Spread of an Opinion in Few Steps: Opinion Diffusion in Non-Binary Networks

We consider the setting of asynchronous opinion diffusion with majority threshold: given a social network with each agent assigned to one opinion, an agent will update its opinion if the majority of its neighbors agree on a different opinion. The stabilized final outcome highly depends on the sequence in which agents update their opinion. We are interested in optimistic sequences---sequences that maximize the spread of a chosen opinion. We complement known results for two opinions where optimistic sequences can be computed in time and length linear in the number of agents. We analyze upper and lower bounds on the length of optimistic sequences, showing quadratic bounds in the general and linear bounds in the acyclic case. Moreover, we show that in networks with more than two opinions determining a spread-maximizing sequence becomes intractable; surprisingly, already with three opinions the intractability results hold in highly restricted cases, e.g., when each agent has at most three neighbors, when looking for a short sequence, or when we aim for approximate solutions.
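A single asynchronous update sequence is easy to simulate (a sketch of the diffusion rule as stated above; finding a spread-maximizing sequence is the hard part the talk addresses):

```python
def apply_sequence(adj, opinions, sequence):
    """Asynchronous majority diffusion: agents update in the given
    order; an agent adopts an opinion iff a strict majority of its
    neighbors currently holds that opinion. Returns the resulting
    opinion assignment."""
    ops = dict(opinions)
    for agent in sequence:
        neigh = adj[agent]
        if not neigh:
            continue
        counts = {}
        for v in neigh:
            counts[ops[v]] = counts.get(ops[v], 0) + 1
        op, cnt = max(counts.items(), key=lambda kv: kv[1])
        if cnt > len(neigh) / 2:
            ops[agent] = op
    return ops
```

On the path A-B-A, updating the middle agent flips it to A; the empty sequence leaves the profile unchanged, illustrating how the outcome depends on the update order.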

Danny Hermelin
(Ben-Gurion University of the Negev)
Scheduling Lower Bounds via AND Subset Sum

Given N instances (X_1, t_1), …, (X_N, t_N) of Subset Sum, the AND Subset Sum problem asks to determine whether all of these instances are yes-instances; that is, whether each set of integers X_i has a subset that sums up to the target integer t_i. We prove that this problem cannot be solved in time Õ((N · t_max)^(1−ε)), for t_max = max_i t_i and any ε > 0, assuming the ∀∃ Strong Exponential Time Hypothesis (∀∃-SETH). We then use this result to exclude Õ(n + P_max · n^(1−ε))-time algorithms for several scheduling problems on n jobs with maximum processing time P_max, assuming ∀∃-SETH. These include classical problems such as 1||∑ w_j U_j, the problem of minimizing the total weight of tardy jobs on a single machine, and P2||∑ U_j, the problem of minimizing the number of tardy jobs on two identical parallel machines.
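The naive upper bound that the lower bound is measured against simply solves each instance with the standard pseudo-polynomial dynamic program (a baseline sketch, not a result of the talk):

```python
def subset_sum(X, t):
    """Standard pseudo-polynomial DP in O(|X| * t): bit s of the mask
    is set iff sum s is reachable by some subset of X."""
    reachable = 1
    for x in X:
        reachable |= reachable << x
        reachable &= (1 << (t + 1)) - 1  # keep only sums up to t
    return bool(reachable >> t & 1)

def and_subset_sum(instances):
    """AND Subset Sum, solved naively: answer yes iff every instance
    (X_i, t_i) is a yes-instance; total time O(sum_i |X_i| * t_i)."""
    return all(subset_sum(X, t) for X, t in instances)
```

The talk's result says that, under ∀∃-SETH, no algorithm can beat this kind of bound by a polynomial factor in N · t_max.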

Valentin Mayer-Eichberger
(TU Berlin)
Introduction to SAT Programming

Solving NP-complete problems with off-the-shelf SAT solvers has been a promising approach for many types of problems. One of the challenges lies in the translation to CNF such that SAT solvers are able to prune the search space as efficiently as possible. In this talk I will introduce the concept of SAT Programming and present various results from my thesis. My research explores the design space of CNF encodings for intermediate representations (or constraints) such as linear integer inequalities or decision diagrams. In particular, I analyse the criterion of consistency through unit propagation and establish quality criteria to separate good from bad encodings.
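To make "consistency through unit propagation" concrete, here is a textbook example (the pairwise at-most-one encoding, chosen for illustration; it is not claimed to be an encoding studied in the thesis): once one variable is set to true, unit propagation alone forces all others to false.

```python
def at_most_one_pairwise(vars_):
    """Pairwise CNF encoding of at-most-one: one clause (-x, -y) per
    pair of variables. Literals are signed integers, DIMACS-style."""
    return [(-x, -y) for i, x in enumerate(vars_) for y in vars_[i + 1:]]

def unit_propagate(clauses, assignment):
    """Repeatedly assign the remaining literal of any unit clause
    (a clause that is unsatisfied with exactly one unassigned literal)."""
    assignment = dict(assignment)  # var -> bool
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue  # clause already satisfied
            unassigned = [l for l in clause if abs(l) not in assignment]
            if len(unassigned) == 1:
                l = unassigned[0]
                assignment[abs(l)] = l > 0
                changed = True
    return assignment
```

Setting variable 1 to true and propagating derives 2 = false and 3 = false without any search, which is exactly the consistency property a good encoding should provide.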
