Novel Directions in Data Reduction for NP-Hard Problems: Truth-Table Kernelizations
Manuel Sorge (TU Berlin)
Kernelization, that is, provably efficient and effective data reduction for NP-hard problems, is a key concept of parameterized computational complexity analysis. In a nutshell, a kernelization is a polynomial-time data reduction procedure that shrinks the original input instance to an equivalent one whose size can be upper-bounded by a function depending solely on a chosen problem-specific parameter (e.g., solution size). Deciding the existence of polynomial-size problem kernels is of central interest in parameterized complexity analysis. The standard kernelization concept asks for "many-one" problem kernels. In recent years, several non-existence results for polynomial-size many-one problem kernels have been proven, based on the assumption that NP is not a subset of coNP/poly. This motivates the consideration of more relaxed problem kernel notions such as Turing kernels, where the input instance is translated into multiple small instances. While Turing kernelization allows "adaptive queries", the few known Turing kernels ignore this feature and are therefore parallelizable. In this work, we start exploring the power and limitations of such "truth-table" kernelizations, relate them to Turing as well as many-one kernelizations, and argue for their practical and theoretical relevance.
This is joint work with Mathias Weller and Rolf Niedermeier.
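To make the many-one kernelization concept concrete, here is a sketch of a textbook example that is not specific to this talk: Buss's kernelization for Vertex Cover parameterized by solution size k. High-degree vertices are forced into the cover, and afterwards a yes-instance can have at most k² edges, so the reduced instance's size is bounded by a function of k alone.

```python
# Illustrative sketch (standard textbook example, not from this talk):
# Buss's many-one kernelization for Vertex Cover with parameter k.
# Input: an edge list and the budget k; output: an equivalent reduced
# instance (edge set, remaining budget), or None if provably a no-instance.

def buss_kernel(edges, k):
    edges = {frozenset(e) for e in edges}
    changed = True
    while changed:
        changed = False
        # Reduction rule: a vertex of degree > k must belong to every
        # cover of size <= k, so put it into the cover (delete its edges)
        # and decrease the budget.
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
        if k < 0:
            return None  # budget exhausted: provably a no-instance
    # Now every vertex has degree <= k, so k cover vertices can cover at
    # most k*k edges; more edges than that means a no-instance.
    if len(edges) > k * k:
        return None
    return edges, k
```

Isolated vertices vanish automatically in the edge-set representation. The point of the example is the size guarantee: the surviving instance has at most k² edges, a bound depending only on the parameter, which is exactly what a many-one kernel demands.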