Confluent Data Reduction for Edge Clique Cover: A Bridge Between Graph Transformation and Kernelization
 (TU Berlin)
Kernelization is a core tool of parameterized algorithmics for coping with computationally intractable problems. A kernelization reduces, in polynomial time, an input instance to an equivalent instance whose size is bounded by a function depending only on some problem-specific parameter k; this new instance is called a problem kernel. Typically, problem kernels are obtained by applying efficient data reduction rules. So far, there has been little study in the literature concerning the mutual interaction of data reduction rules; in particular, it seems unstudied whether the data reduction rules for a specific problem always lead to the same reduced instance, no matter in which order the rules are applied. This corresponds to the concept of confluence from rewriting system theory. We argue that it is interesting to study whether a kernelization is confluent. To this end, we use the NP-hard graph problem (Edge) Clique Cover as a running example, which asks for the smallest set of cliques in a graph such that each edge is contained in at least one clique. We provide a linear-time confluent kernelization for this problem and demonstrate how to apply the concept of critical pair analysis from graph transformation theory to achieve this, supported by the AGG software tool. We generalize our investigations to the Partial Clique Cover problem, which requires a more involved analysis. These results support the main goal of our work, namely, to establish a link between (parameterized) algorithmics and graph transformation theory, two so far unrelated fields whose interaction promises to be beneficial for both sides.
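To give a flavor of the kind of data reduction and order-independence discussed above, the following is a minimal sketch in Python of one classic, textbook-style reduction rule for Edge Clique Cover: deleting one of two adjacent "true twin" vertices (vertices with identical closed neighborhoods), since both can be served by the same cliques. This is an illustration only, not the rule set from the talk, and it omits the bookkeeping of the parameter k and of covered-edge annotations that an actual kernelization maintains. Confluence would mean that exhaustively applying such rules yields the same reduced instance regardless of the order of application.

```python
# Illustrative sketch: a "true twin" reduction rule for Edge Clique Cover.
# Graphs are represented as adjacency dicts mapping a vertex to the set of
# its neighbors. NOTE: this is a generic textbook-style rule, not the rule
# set of the talk, and it ignores parameter/annotation bookkeeping.

def closed_neighborhood(adj, v):
    """Closed neighborhood N[v] = N(v) together with v itself."""
    return adj[v] | {v}

def apply_twin_rule_once(adj):
    """Find one pair of adjacent true twins u, v (i.e. N[u] = N[v])
    and delete v. Returns True iff a reduction step was applied."""
    for u in adj:
        for v in adj[u]:
            if closed_neighborhood(adj, u) == closed_neighborhood(adj, v):
                for w in adj[v]:          # remove v from all neighbor lists
                    adj[w].discard(v)
                del adj[v]                # drop v itself
                return True
    return False

def reduce_graph(adj):
    """Exhaustively apply the twin rule. Confluence of a rule system means
    the final reduced graph is independent of the application order."""
    while apply_twin_rule_once(adj):
        pass
    return adj

# Example: in a triangle K3 every pair of vertices consists of true twins,
# so whichever pair is reduced first, the graph shrinks to a single vertex.
triangle = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}}
reduce_graph(triangle)
print(len(triangle))  # 1: same reduced instance for every rule order
```

The example hints at why confluence is worth proving rather than assuming: in K3 several rule applications overlap on shared vertices, which is exactly the situation that critical pair analysis examines systematically.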
This is a joint work with Hartmut Ehrig, Claudia Ermel, Rolf Niedermeier, and Olga Runge.