## MaMu: Matching under Preferences: Multimodal Views and Domain Restrictions

Funded by Deutsche Forschungsgemeinschaft (Project MaMu, NI 369/19) and the National Natural Science Foundation of China, started in April 2018.

Synopsis:

Given the great relevance of matching under preferences, the goals
of this joint research project are two-fold. First, we want
to investigate how restrictions on the allowed preferences can improve
the efficiency of several solution algorithms, extending and improving
results from the literature. Second, we plan to study stable matching
for multi-layer preferences, a new direction we introduce here; it is
well motivated by the increasing availability of multimodal data from
different sources. To this end, after carefully introducing the
relevant notions, it is of central interest to investigate how
classical concepts and algorithms can be transferred to this more
general scenario. Finally, in a third part we plan to study further
models going beyond the two main parts mentioned above; in particular,
we aim to touch on issues such as "incremental scenarios" and
game-theoretic aspects.
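As a concrete baseline for the single-layer setting, the following is a minimal sketch (illustrative only, not a project result) of the classical Gale-Shapley proposal algorithm for the stable marriage problem; all agent names and preference lists are made up.

```python
# Minimal sketch of the Gale-Shapley algorithm for stable marriage
# with one layer of preferences; names and preferences are illustrative.

def gale_shapley(proposer_prefs, acceptor_prefs):
    """Return a stable matching as a dict proposer -> acceptor."""
    # rank[a][p] = position of proposer p in acceptor a's preference list
    rank = {a: {p: i for i, p in enumerate(prefs)}
            for a, prefs in acceptor_prefs.items()}
    next_choice = {p: 0 for p in proposer_prefs}  # next acceptor to try
    engaged_to = {}                               # acceptor -> proposer
    free = list(proposer_prefs)
    while free:
        p = free.pop()
        a = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if a not in engaged_to:
            engaged_to[a] = p
        elif rank[a][p] < rank[a][engaged_to[a]]:  # a prefers p
            free.append(engaged_to[a])
            engaged_to[a] = p
        else:
            free.append(p)                         # p stays free
    return {p: a for a, p in engaged_to.items()}

matching = gale_shapley(
    {"x": ["A", "B"], "y": ["B", "A"]},
    {"A": ["y", "x"], "B": ["x", "y"]},
)
```

Here both proposers obtain their first choice, so no blocking pair exists; the multi-layer setting studied in the project asks what happens when several such preference profiles must be reconciled at once.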

The
project brings together the expertise of the Chinese and the German
research groups. Both sides have actively contributed to several
topics related to Computational Social Choice and, in particular, to
restricted preferences and issues of preference aggregation. Moreover,
both groups are heavily involved in parameterized complexity
analysis and algorithms for hard combinatorial problems in general.
Based on this common ground, the proposed project builds on (partially
joint) previous experience and joins forces in attacking a
fundamental issue of Computational Social Choice and closely related
areas: matching under preferences, as we will demonstrate, is both
fundamental and unexplored in several directions, allowing for
numerous future innovations (conceptual and methodological) that we
want to provide.

People at TU Berlin: Niclas Böhmer; Head: Rolf Niedermeier

People at Shandong University: Jiong Guo and Hong Liu; Head: Daming Zhu

## TORE: Trade-offs in Parameterized Data Reduction

Funded by Deutsche Forschungsgemeinschaft (Project TORE, NI 369/18) and Russian Foundation for Basic Research, starting in March 2018.

Synopsis:

Kernelization is arguably one of the major contributions of parameterized complexity analysis to algorithm design. In this project, based on many years of experience with various facets of kernelization, we plan to further push the development of new aspects of kernelization, focusing particularly on various trade-off effects. In particular, we are interested in trade-offs between efficient kernelization time, effective kernel size, and a number of quality measures for kernels. This leads us to Pareto-optimal kernels, and to exploiting "chaining effects" by executing one kernelization after another. Further, we plan to systematically explore concepts such as partial kernelization, approximative kernelization, randomized kernelization, and Turing kernelization, all with the ultimate goal of trading better running time and/or kernel size against some relaxation of the kernel concept. In doing so, we plan to contribute both theoretically (including new frameworks and lower bounds) and practically (up to algorithm engineering). Although our focus is on NP-hard problems, we also include polynomial-time solvable problems in our considerations.
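To make the notion of a problem kernel concrete, here is a textbook example (not a result of this project): Buss's classical kernelization for Vertex Cover. A vertex of degree greater than k must be in every cover of size at most k, and a fully reduced yes-instance has at most k² edges.

```python
# Sketch of Buss's kernelization for Vertex Cover (textbook example):
# repeatedly force high-degree vertices into the cover, then bound
# the size of the surviving instance by k*k edges.

def buss_kernel(edges, k):
    """Reduce (edges, k); return (kernel_edges, remaining_k, feasible)."""
    edges = {frozenset(e) for e in edges}
    changed = True
    while changed and k >= 0:
        changed = False
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:                       # v must be in every small cover
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:
        return set(), k, False              # provably a no-instance
    return edges, k, True                   # kernel with <= k*k edges
```

For a star with five leaves and k = 1, the center is forced into the cover and the kernel becomes empty; for a triangle with k = 1, the rules certify a no-instance.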

People at TU Berlin: Till Fluschnik; Head: Rolf Niedermeier

People at Novosibirsk State University: Oxana Tsidulko; Head: René van Bevern

## MATE: Multivariate Algorithmics for Temporal Graph Problems

Funded by Deutsche Forschungsgemeinschaft (Project MATE, NI 369/17), started in February 2018.

Synopsis:

The focus of this project shall lie on temporal graphs: We are
given a fixed vertex set and discrete time steps such that at each
time step we may have a different edge set. We aim to develop efficient
(parameterized) algorithms for some of the most central problems
related to temporal graphs. The topics we study herein are

• community detection and clustering, in particular with respect to their evolution over time,

• routing problems on temporal graphs or with temporal aspects,

• multi-layer graphs and preservation problems.

We aim to fill a gap in the current literature by
concentrating mostly on exact combinatorial algorithms in the context
of parameterized complexity. That is, we study the influence of small
parameters on the complexity of our problems. To this end, we need to
extend the toolbox of parameterized algorithmics with new approaches
that deal with the temporal aspects of the given problems.

Using
problems from the above contexts in case studies, we aim to identify
key properties of temporal graphs that enable or hinder the
development of efficient algorithms. That is, we aim to identify
important parameters that capture the structure of temporal graphs, and
we want to study how this structure can be exploited to
find efficient algorithms. In this way, we aim to contribute in equal
parts to the theory of temporal graphs as well as to practical
solutions for temporal graph problems.
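To make the object of study concrete, here is a small illustrative sketch (an assumed representation, not project code) of a basic temporal-graph computation: earliest arrival times from a source vertex, where an undirected time-edge (u, v, t) is usable only at time step t and temporal paths must use non-decreasing time labels.

```python
# Illustrative temporal-graph routine: each edge is a triple (u, v, t)
# and can be traversed only at its time step t. One scan of the edges
# in time order yields earliest arrival times from the source.

def earliest_arrival(temporal_edges, source):
    """Earliest arrival time at each reachable vertex (non-strict paths)."""
    arrival = {source: 0}
    for u, v, t in sorted(temporal_edges, key=lambda e: e[2]):
        for a, b in ((u, v), (v, u)):        # undirected time-edge
            if (arrival.get(a, float("inf")) <= t
                    and t < arrival.get(b, float("inf"))):
                arrival[b] = t
    return arrival

arr = earliest_arrival([(0, 1, 2), (1, 2, 1), (1, 2, 3)], 0)
```

Note that vertex 2 is reached only via the edge at time 3: the earlier edge (1, 2, 1) exists before vertex 1 is reachable, which is exactly the kind of temporal effect that breaks classical (static) reachability arguments.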

People: Hendrik Molter, Malte Renken

Head: Rolf Niedermeier

## FPTinP: More Efficient Algorithms for Polynomial-Time Solvable Graph Problems

Funded by Deutsche Forschungsgemeinschaft (Project FPTinP, NI 369/16), started in December 2017.

Synopsis:

The central goal of this project is to extend the focus of
parameterized algorithmics from NP-hard problems to polynomial-time
solvable problems. This is motivated by the observation that
polynomial-time algorithms with, say, cubic running times are at best
questionable for many applications, a fact that is particularly
obvious in the context of Big Data.

In contrast, an
algorithm running in O(k^4 · n) time for some problem-specific
parameter k can be much more desirable, provided that k is much smaller
than n in the application. As in classical parameterized
algorithmics, we want to focus on graph problems with polynomial-time
algorithms, using the rich arsenal of parameterized methods and
techniques developed so far.
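A textbook illustration of this "FPT in P" perspective (not a project result) is triangle counting in time roughly O(m · d), where the degeneracy d of the graph is often much smaller than n: orienting every edge along a degeneracy ordering bounds each out-degree by d.

```python
# Textbook "FPT in P" example: counting triangles parameterized by the
# degeneracy d. Orienting edges along a degeneracy ordering bounds
# out-degrees by d, giving roughly O(m * d) time instead of O(n^3).

def degeneracy_order(adj):
    """Repeatedly remove a minimum-degree vertex; returns the ordering."""
    adj = {v: set(ns) for v, ns in adj.items()}
    order = []
    while adj:
        v = min(adj, key=lambda u: len(adj[u]))
        order.append(v)
        for u in adj[v]:
            adj[u].discard(v)
        del adj[v]
    return order

def count_triangles(adj):
    """adj: vertex -> set of neighbors (undirected, symmetric)."""
    pos = {v: i for i, v in enumerate(degeneracy_order(adj))}
    # orient each edge toward the later vertex in the ordering
    out = {v: {u for u in adj[v] if pos[u] > pos[v]} for v in adj}
    # each triangle is counted exactly once, at its earliest vertex
    return sum(1 for v in adj for u in out[v] for w in out[u]
               if w in out[v])
```

For the complete graph K4 this returns 4, and for a single triangle it returns 1; on sparse graphs (small d) the running time stays near-linear in the number of edges.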

People: Tomohiro Koana

Head: Rolf Niedermeier

## AFFA: Algorithms for Fair Allocations

Funded by Deutsche Forschungsgemeinschaft (Project AFFA, NI 369/15 and BR5207/1), started in October 2016.

Synopsis:

The main focus of AFFA is the investigation of important computational problems in the context of fair allocations with respect to their algorithmic tractability. The goals include basic problems such as verifying the existence of fair allocations, finding the corresponding allocations, and determining the computational complexity of specific fixed protocols that find such allocations. Moreover, we want to analyze problems in the context of manipulation, bribery, and external control. The underlying problems are very often NP-hard, which also motivates the application of parameterized, approximation, and heuristic approaches. In particular, we want to identify the limits of approximability in polynomial and FPT time. In addition, efficient reductions to established standard solving methods (such as highly optimized SAT or ILP solvers) seem promising for finding exact solutions for at least some inputs. Finally, we also plan to test the developed algorithms with respect to their practical usefulness through implementations and experimental investigations, and we want to make our software publicly available.
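A minimal example of a verification problem of the kind mentioned above (illustrative only; the utilities and allocations are made up): checking whether a given allocation of indivisible goods is envy-free under additive utilities.

```python
# Toy verification example: is a given allocation of indivisible goods
# envy-free under additive utilities? (Utilities and items are made up.)

def is_envy_free(utilities, allocation):
    """utilities[agent][item] -> value; allocation[agent] -> set of items.

    Envy-free: every agent values its own bundle at least as highly as
    it values every other agent's bundle.
    """
    def value(agent, bundle):
        return sum(utilities[agent][item] for item in bundle)
    return all(
        value(a, allocation[a]) >= value(a, allocation[b])
        for a in allocation for b in allocation
    )

prefs = {"a": {"x": 3, "y": 1}, "b": {"x": 1, "y": 3}}
```

Giving each agent its favorite item is envy-free here, while swapping the bundles is not; verification is easy, whereas deciding whether any envy-free allocation exists is computationally hard in general, which motivates the parameterized and approximation approaches mentioned above.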

People at TU Berlin: Andrzej Kaczmarczyk; Head: Rolf Niedermeier

People at Humboldt-Universität zu Berlin: Robert Bredereck

## DAMM: Data Reduction in Parameterized Algorithmics - New Models and Methods

Funded by Deutsche Forschungsgemeinschaft (Project DAMM, NI 369/13), started in December 2012.

Synopsis:

Preprocessing by means of data reduction rules is a common approach to the algorithmic treatment of combinatorially difficult and compute-intensive problems. Parameterized algorithmics formalizes the concept of preprocessing in terms of problem kernels. So far, research has mainly focused on the size of the problem kernel and lacked a broader analysis of preprocessing. Thus, the goal of DAMM is to investigate different aspects of preprocessing, taking into consideration other interesting properties of problem kernels, such as the running time. We also aim at developing kernelizations with respect to "non-standard" and "multivariate" parameterizations that could be of practical interest. Finally, we will consider different kernelization models that generalize the concept of problem kernels, such as Turing kernels, analyzing their relations and possible applications.
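To illustrate what a single data reduction rule looks like (a standard textbook rule for Vertex Cover, not a project result): a degree-1 vertex can always be handled by putting its unique neighbor into the cover and deleting all edges this neighbor covers, shrinking the instance without losing optimality.

```python
# Textbook data reduction rule for Vertex Cover: for a degree-1 vertex,
# taking its unique neighbor into the cover is always safe. Applying
# such rules exhaustively shrinks the instance before any search starts.

def apply_degree_one_rule(edges, cover):
    """Exhaustively apply the rule; returns (reduced_edges, partial_cover)."""
    edges = {frozenset(e) for e in edges}
    changed = True
    while changed:
        changed = False
        incident = {}
        for e in edges:
            for v in e:
                incident.setdefault(v, []).append(e)
        for v, inc in incident.items():
            if len(inc) == 1:                  # v is a pendant vertex
                (u,) = set(inc[0]) - {v}       # its unique neighbor
                cover.add(u)                   # taking u is always safe
                edges = {e for e in edges if u not in e}
                changed = True
                break
    return edges, cover

reduced, partial = apply_degree_one_rule([(0, 1), (1, 2)], set())
```

On the path 0-1-2 the rule alone solves the instance, producing the optimal cover {1}; in general it leaves a smaller residual instance on which further rules or exhaustive search can run.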

Head: Rolf Niedermeier
