Results 1 - 4 of 4
Data Reduction and Exact Algorithms for Clique Cover, 2007
Abstract

Cited by 14 (2 self)
To cover the edges of a graph with a minimum number of cliques is an NP-hard problem with many applications. We develop for this problem efficient and effective polynomial-time data reduction rules that, combined with a search tree algorithm, allow for exact problem solutions in competitive time. This is confirmed by experiments with real-world and synthetic data. Moreover, we prove the fixed-parameter tractability of covering edges by cliques.
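The core "prescribed clique" reduction behind such rules (an uncovered edge whose endpoints' common neighborhood induces a clique lies in exactly one maximal clique, which must then appear in any optimal cover) can be sketched as follows. This is a minimal, simplified Python sketch; the function names and single-pass bookkeeping are ours, not the paper's exact rule set:

```python
from itertools import combinations

def is_clique(g, verts):
    """True iff every pair of vertices in verts is adjacent in g
    (g maps each vertex to its set of neighbors)."""
    return all(b in g[a] for a, b in combinations(verts, 2))

def reduce_prescribed_edges(g):
    """One reduction pass: if an uncovered edge (u, v) has a common
    neighborhood N(u) & N(v) that induces a clique, then (u, v) lies in
    exactly one maximal clique, which is forced into any optimal cover.
    Returns the forced cliques and the set of edges they cover."""
    forced, covered = [], set()
    for u in g:
        for v in g[u]:
            if u < v and frozenset((u, v)) not in covered:
                common = g[u] & g[v]
                if is_clique(g, common):
                    clique = common | {u, v}
                    forced.append(clique)
                    for a, b in combinations(clique, 2):
                        covered.add(frozenset((a, b)))
    return forced, covered

# On a triangle, one pass forces the single clique {1, 2, 3},
# covering all three edges at once.
triangle = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}}
cliques, done = reduce_prescribed_edges(triangle)
```

In the full algorithm, rules like this are interleaved with a search tree that branches on the maximal cliques containing a still-uncovered edge.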
Known algorithms for edge clique cover are probably optimal. Manuscript, arXiv:1203.1754v1, 2012
Abstract

Cited by 4 (0 self)
In the EDGE CLIQUE COVER (ECC) problem, given a graph G and an integer k, we ask whether the edges of G can be covered with k complete subgraphs of G or, equivalently, whether G admits an intersection model on a k-element universe. Gramm et al. [JEA 2008] have shown a set of simple rules that reduce the number of vertices of G to 2^k, and no algorithm is known with a significantly better running time bound than a brute-force search on this reduced instance. In this paper we show that the approach of Gramm et al. is essentially optimal: we present a polynomial-time algorithm that reduces an arbitrary 3-CNF-SAT formula with n variables and m clauses to an equivalent ECC instance (G, k) with k = O(log n) and |V(G)| = O(n + m). Consequently, there is no 2^{2^{o(k)}}·poly(n)-time algorithm for the ECC problem, unless the Exponential Time Hypothesis fails. To the best of our knowledge, these are the first results for a natural, fixed-parameter tractable problem proving that a doubly-exponential dependency on the parameter is essentially necessary.
Fixed-Parameter Algorithms for Graph-Modeled Data Clustering
Abstract

Cited by 1 (1 self)
Fixed-parameter algorithms can efficiently find optimal solutions to some NP-hard problems, including several problems that arise in graph-modeled data clustering. This survey provides a primer about practical techniques to develop such algorithms; in particular, we discuss the design of kernelizations (data reductions with provable performance guarantees) and depth-bounded search trees. Our investigations are circumstantiated by three concrete problems from the realm of graph-modeled data clustering for which fixed-parameter algorithms have been implemented and experimentally evaluated, namely CLIQUE, CLUSTER EDITING, and CLIQUE COVER.
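A classic example of the depth-bounded search trees the survey covers is the three-way branching for CLUSTER EDITING on a conflict triple (an induced path u-v-w). The following is an illustrative Python sketch with our own function names, not the implementation evaluated in the survey:

```python
def find_conflict(g):
    """Return an induced path (u, v, w): uv and vw are edges, uw is not.
    None means g is already a cluster graph (disjoint union of cliques)."""
    for v in g:
        for u in g[v]:
            for w in g[v]:
                if u != w and w not in g[u]:
                    return u, v, w
    return None

def cluster_editing(g, k):
    """Depth-bounded search tree: can g be made a cluster graph with at
    most k edge insertions/deletions?  Branch three ways on a conflict
    triple; the recursion depth is bounded by k."""
    conflict = find_conflict(g)
    if conflict is None:
        return True          # no conflict triple left: g is a cluster graph
    if k == 0:
        return False         # budget exhausted but conflicts remain
    u, v, w = conflict

    def toggle(a, b):        # flip edge ab: delete if present, else insert
        if b in g[a]:
            g[a].discard(b); g[b].discard(a)
        else:
            g[a].add(b); g[b].add(a)

    for a, b in ((u, v), (v, w), (u, w)):
        toggle(a, b)                     # try one of the three edits
        if cluster_editing(g, k - 1):
            toggle(a, b)                 # undo, keep caller's graph intact
            return True
        toggle(a, b)                     # undo and try the next branch
    return False
```

This naive branching yields a search tree of size O(3^k); refined branching rules in the literature achieve smaller bases.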
Confluence in Data Reduction: Bridging Graph Transformation and Kernelization, 2013
Abstract
Kernelization is a core tool of parameterized algorithmics for coping with computationally intractable problems. A kernelization reduces in polynomial time an input instance to an equivalent instance whose size is bounded by a function depending only on some problem-specific parameter k; this new instance is called the problem kernel. Typically, problem kernels are achieved by performing efficient data reduction rules. So far, there was little systematic study in the literature concerning the mutual interaction of data reduction rules, in particular whether data reduction rules for a specific problem always lead to the same reduced instance, no matter in which order the rules are applied. This corresponds to the concept of confluence from the theory of rewriting systems. We argue that it is valuable to study whether a kernelization is confluent, using the NP-hard graph problems (Edge) Clique Cover and Partial Clique Cover as running examples. We apply the concept of critical pair analysis from graph transformation theory, supported by the AGG software tool. These results support the main goal of our work, namely, to establish a fruitful link between (parameterized) algorithmics and graph transformation theory, two so far unrelated fields.
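To make the confluence question concrete, here is a toy probe in Python (our own construction, not the paper's AGG-based critical pair analysis): two clique-cover-style reduction rules, heavily simplified for illustration, are applied in both orders to the same instance and the reduced instances compared:

```python
def rule_isolated(g, k):
    """R1: delete isolated vertices; they have no edges to cover."""
    return {v: nbrs for v, nbrs in g.items() if nbrs}, k

def rule_simplicial(g, k):
    """R2 (simplified): a vertex whose neighborhood induces a clique has
    all its edges covered by that one clique; remove it, decrement k."""
    for v, nbrs in g.items():
        if nbrs and all(b in g[a] for a in nbrs for b in nbrs if a != b):
            return {u: n - {v} for u, n in g.items() if u != v}, k - 1
    return g, k

def same_reduced_instance(g, k):
    """Crude confluence probe on one instance: apply R1 then R2, and
    R2 then R1, and compare the resulting reduced instances."""
    a = rule_simplicial(*rule_isolated(g, k))
    b = rule_isolated(*rule_simplicial(g, k))
    return a == b
```

A single agreeing instance proves nothing, of course; the paper's point is that critical pair analysis can certify such agreement for all instances and all rule orders.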