Results 1–10 of 61
Kernelization of Packing Problems, 2011
Abstract

Cited by 20 (2 self)
Kernelization algorithms are polynomial-time reductions from a problem to itself that guarantee their output to have a size not exceeding some bound. For example, d-Set Matching for integers d ≥ 3 is the problem of finding a matching of size at least k in a given d-uniform hypergraph and has kernels with O(k^d) edges. Recently, Bodlaender et al. [ICALP 2008], Fortnow and Santhanam [STOC 2008], and Dell and Van Melkebeek [STOC 2010] developed a framework for proving lower bounds on the kernel size for certain problems, under the complexity-theoretic hypothesis that coNP is not contained in NP/poly. Under the same hypothesis, we show lower bounds for the kernelization of d-Set Matching and other packing problems. Our bounds are tight for d-Set Matching: it does not have kernels with O(k^{d−ε}) edges for any ε > 0 unless the hypothesis fails. By reduction, this transfers to a bound of O(k^{d−1−ε}) for the problem of finding k vertex-disjoint cliques of size d in standard graphs. It is natural to ask for tight bounds on the kernel sizes of such graph packing problems. We make first progress in that direction by showing non-trivial kernels with O(k^{2.5}) edges for the problem of finding k vertex-disjoint paths of three edges each. This does not quite match the best lower bound of O(k^{2−ε}) that we can prove. Most of our lower bound proofs follow a general scheme that we discover: to exclude kernels of size O(k^{d−ε}) for a problem in d-uniform hypergraphs, one should reduce from a carefully chosen d-partite problem that is still NP-hard. As an illustration, we apply this scheme to the vertex cover problem, which allows us to replace the number-theoretical construction by Dell and Van Melkebeek [STOC 2010] with shorter elementary arguments.
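To make the notion of a kernel concrete in the simplest setting, the classical Buss reduction for Vertex Cover shrinks an instance (G, k) to an equivalent instance with at most k^2 edges. The sketch below is illustrative only (not from the paper); the function name and graph encoding are our own:

```python
# A minimal sketch of Buss's kernelization for Vertex Cover:
# any vertex of degree > k must belong to every cover of size <= k,
# and a reduced graph with more than k^2 edges has no such cover.
def buss_kernel(edges, k):
    """Return (reduced_edges, reduced_k), or None if provably a no-instance."""
    edges = {frozenset(e) for e in edges}
    changed = True
    while changed and k >= 0:
        changed = False
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        for v, d in deg.items():
            if d > k:                      # v is forced into the cover
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:        # kernel size bound violated
        return None
    return edges, k
```

For a star K_{1,5} with k = 1 the high-degree center is taken and an empty kernel remains; for a triangle with k = 1 the size bound fails, correctly reporting a no-instance.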
Compression via Matroids: A Randomized Polynomial Kernel for Odd Cycle Transversal
Abstract

Cited by 19 (4 self)
The Odd Cycle Transversal problem (OCT) asks whether a given graph can be made bipartite by deleting at most k of its vertices. In a breakthrough result Reed, Smith, and Vetta (Operations Research Letters, 2004) gave an O(4^k kmn) time algorithm for it, the first algorithm with polynomial runtime of uniform degree for every fixed k. It is known that this implies a polynomial-time compression algorithm that turns OCT instances into equivalent instances of size at most O(4^k), a so-called kernelization. Since then the existence of a polynomial kernel for OCT, i.e., a kernelization with size bounded polynomially in k, has turned into one of the main open questions in the study of kernelization. Despite the impressive progress in the area, including the recent development of lower bound techniques (Bodlaender ...
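The problem definition above is easy to pin down with a brute-force checker: try every deletion set of size at most k and test bipartiteness by 2-coloring. This sketch is ours, not the paper's algorithm, and runs in exponential time; it is only meant to make the definition executable:

```python
from itertools import combinations

def is_bipartite(adj):
    """2-color a graph given as {vertex: set(neighbors)} via BFS."""
    color = {}
    for s in adj:
        if s in color:
            continue
        color[s] = 0
        queue = [s]
        while queue:
            u = queue.pop()
            for v in adj[u]:
                if v not in color:
                    color[v] = 1 - color[u]
                    queue.append(v)
                elif color[v] == color[u]:   # odd cycle found
                    return False
    return True

def has_oct(edges, k):
    """Brute force: can deleting <= k vertices leave a bipartite graph?"""
    vertices = {v for e in edges for v in e}
    for r in range(k + 1):
        for delete in combinations(vertices, r):
            rest = set(delete)
            adj = {v: set() for v in vertices if v not in rest}
            for a, b in edges:
                if a not in rest and b not in rest:
                    adj[a].add(b)
                    adj[b].add(a)
            if is_bipartite(adj):
                return True
    return False
```

A triangle is a no-instance for k = 0 and a yes-instance for k = 1, since deleting any one vertex breaks its odd cycle.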
Planar F-Deletion: Approximation, Kernelization and Optimal FPT Algorithms
Abstract

Cited by 18 (8 self)
Let F be a finite set of graphs. In the F-Deletion problem, we are given an n-vertex graph G and an integer k as input, and asked whether at most k vertices can be deleted from G such that the resulting graph does not contain a graph from F as a minor. F-Deletion is a generic problem and by selecting different sets of forbidden minors F, one can obtain various fundamental problems such as Vertex Cover, Feedback Vertex Set or Treewidth η-Deletion. In this paper we obtain a number of generic algorithmic results about F-Deletion, when F contains at least one planar graph. The highlights of our work are
• a constant factor approximation algorithm for the optimization version of F-Deletion;
• a linear time and single exponential parameterized algorithm, that is, an algorithm running in time O(2^{O(k)} n), for the parameterized version of F-Deletion where all graphs in F are connected;
• a polynomial kernel for parameterized F-Deletion.
These algorithms unify, generalize, and improve a multitude of results in the literature. Our main results have several direct applications, but also the methods we develop on the way have applicability beyond the scope of this paper. Our results – constant factor approximation, polynomial kernelization and FPT algorithms – are stringed together by a common theme of polynomial time preprocessing.
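One concrete special case of F-Deletion: with F = {K_3}, a graph contains K_3 as a minor exactly when it contains a cycle, so the problem becomes Feedback Vertex Set. A brute-force sketch under that assumption (function names are ours, and this is not the paper's algorithm):

```python
from itertools import combinations

def is_forest(vertices, edges):
    """Cycle test via union-find: a graph is K_3-minor-free iff it is a forest."""
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for a, b in edges:
        ra, rb = find(a), find(b)
        if ra == rb:                       # edge closes a cycle
            return False
        parent[ra] = rb
    return True

def f_deletion_triangle(edges, k):
    """F-Deletion with F = {K_3}: can deleting <= k vertices make G acyclic?"""
    vertices = {v for e in edges for v in e}
    for r in range(k + 1):
        for delete in combinations(vertices, r):
            rest = set(delete)
            kept = [e for e in edges if rest.isdisjoint(e)]
            if is_forest(vertices - rest, kept):
                return True
    return False
```

On a triangle, no deletion set of size 0 works, but deleting any single vertex leaves a forest.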
Linear kernels and single-exponential algorithms via protrusion decompositions, 2012
Abstract

Cited by 16 (5 self)
A t-treewidth-modulator of a graph G is a set X ⊆ V(G) such that the treewidth of G − X is at most t − 1. In this paper, we present a novel algorithm to compute a decomposition scheme for graphs G that come equipped with a t-treewidth-modulator. Similar decompositions have already been explicitly or implicitly used for obtaining polynomial kernels [3, 7, 33, 43]. Our decomposition, called a protrusion decomposition, is the cornerstone in obtaining the following two main results. Our first result is that any parameterized graph problem (with parameter k) that has finite integer index and is treewidth-bounding admits a linear kernel on the class of H-topological-minor-free graphs, where H is some arbitrary but fixed graph. A parameterized graph problem is called treewidth-bounding if all positive instances have a t-treewidth-modulator of size O(k), for some constant t. This result partially extends previous meta-theorems on the existence of linear kernels on graphs of bounded genus [7] and H-minor-free graphs [37]. In particular, we show that Chordal Vertex Deletion, Interval Vertex Deletion, Treewidth-t Vertex Deletion, and Edge Dominating Set have linear kernels on H-topological-minor-free graphs.
On brambles, grid-like minors, and parameterized intractability of monadic second-order logic
Abstract

Cited by 12 (4 self)
Brambles were introduced as the dual notion to treewidth, one of the most central concepts of the graph minor theory of Robertson and Seymour. Recently, Grohe and Marx showed that there are graphs G in which every bramble of order larger than the square root of the treewidth is of exponential size in G. On the positive side, they show the existence of polynomial-sized brambles of the order of the square root of the treewidth, up to log factors. We provide the first polynomial time algorithm to construct a bramble in general graphs and achieve this bound, up to log factors. We use this algorithm to construct grid-like minors, a replacement structure for grid-minors recently introduced by Reed and Wood, in polynomial time. Using the grid-like ...
Hitting forbidden minors: Approximation and kernelization
 – In Proceedings of the 8th International Symposium on Theoretical Aspects of Computer Science (STACS 2011)
Abstract

Cited by 12 (6 self)
We study a general class of problems called F-Deletion problems. In an F-Deletion problem, we are asked whether a subset of at most k vertices can be deleted from a graph G such that the resulting graph does not contain as a minor any graph from the family F of forbidden minors. We obtain a number of algorithmic results on the F-Deletion problem when F contains a planar graph. We give
• a linear vertex kernel on graphs excluding the t-claw K_{1,t}, the star with t leaves, as an induced subgraph, where t is a fixed integer;
• an approximation algorithm achieving an approximation ratio of O(log^{3/2} OPT), where OPT is the size of an optimal solution on general undirected graphs.
Finally, we obtain polynomial kernels for the case when F contains the graph θ_c as a minor for a fixed integer c. The graph θ_c consists of two vertices connected by c parallel edges. Even though this may appear to be a very restricted class of problems, it already encompasses well-studied problems such as Vertex Cover, Feedback Vertex Set and Diamond Hitting Set. The generic kernelization algorithm is based on a non-trivial application of protrusion techniques, previously used only for problems on topological graph classes.
Bidimensionality and EPTAS
Abstract

Cited by 12 (5 self)
Bidimensionality theory appears to be a powerful framework for the development of meta-algorithmic techniques. It was introduced by Demaine et al. [J. ACM 2005] as a tool to obtain subexponential time parameterized algorithms for problems on H-minor-free graphs. Demaine and Hajiaghayi [SODA 2005] extended the theory to obtain polynomial time approximation schemes (PTASs) for bidimensional problems, and subsequently improved these results to EPTASs. Fomin et al. [SODA 2010] established a third meta-algorithmic direction for bidimensionality theory by relating it to the existence of linear kernels for parameterized problems. In this paper we revisit bidimensionality theory from the perspective of approximation algorithms and redesign the framework for obtaining EPTASs to be more powerful, easier to apply and easier to understand. One of the important conditions required in the framework developed by Demaine and Hajiaghayi [SODA 2005] is that to obtain an EPTAS for a graph optimization problem Π, we have to know a constant-factor approximation algorithm for Π. Our approach eliminates this strong requirement, which makes it amenable to more problems. At the heart of our framework is a decomposition lemma which states that for “most” bidimensional problems, there is a polynomial time algorithm which, given an H-minor-free graph G as input and an ε > 0, outputs a vertex set X of size ε · OPT such that the treewidth of G \ X is O(1/ε). Here, OPT is the objective function value of the problem in question. This allows us to obtain EPTASs on (apex-)minor-free graphs for all problems covered by the previous framework, as well as for a wide range of packing problems, partial covering problems and problems that are neither closed under taking minors, nor contractions. To the best of our knowledge for many of these problems including Cycle Packing, Vertex-H ...
Co-nondeterminism in compositions: A kernelization lower bound for a Ramsey-type problem, 2012
Abstract

Cited by 10 (2 self)
Until recently, techniques for obtaining lower bounds for kernelization were among the most sought-after tools in the field of parameterized complexity. Now, after a strong influx of techniques, we are in the fortunate situation of having tools available that are even stronger than what has been required in their applications so far. Based on a result of Fortnow and Santhanam (STOC 2008, JCSS 2011), Bodlaender et al. (ICALP 2008, JCSS 2009) showed that, unless NP ⊆ coNP/poly, the existence of a deterministic polynomial-time composition algorithm, i.e., an algorithm which outputs an instance of bounded parameter value which is yes if and only if one of t input instances is yes, rules out the existence of polynomial kernels for a problem. Dell and van Melkebeek (STOC 2010) continued this line ...
Implicit Branching and Parameterized Partial Cover Problems
 – In Proc. of IARCS Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS), Leibniz International Proceedings in Informatics, Schloss Dagstuhl – Leibniz-Zentrum fuer Informatik
Abstract

Cited by 8 (1 self)
Covering problems are fundamental classical problems in optimization, computer science and complexity theory. Typically an input to these problems is a family of sets over a finite universe, and the goal is to cover the elements of the universe with as few sets of the family as possible. The variations of covering problems include well-known problems like Set Cover, Vertex Cover, Dominating Set and Facility Location, to name a few. Recently there has been a lot of study on partial covering problems, a natural generalization of covering problems. Here, the goal is not to cover all the elements but to cover a specified number of elements with the minimum number of sets. In this paper we study partial covering problems in graphs in the realm of parameterized complexity. The classical (non-partial) versions of all these problems have been intensively studied in planar graphs and in graphs excluding a fixed graph H as a minor. However, the techniques developed for the parameterized versions of non-partial covering problems cannot be applied directly to their partial counterparts. The approach we use to show that various partial covering problems are fixed-parameter tractable on planar graphs, graphs of bounded local treewidth and graphs excluding some graph as a minor is quite different from previously known techniques. The main idea behind our approach is the concept of implicit branching. We find the implicit branching technique to be interesting on its own and believe that it can be used for some other problems.
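The partial covering setting described above can be made concrete with Partial Vertex Cover: cover at least t edges using at most k vertices. A brute-force sketch (names and encoding are ours, for illustration only, not the paper's technique):

```python
from itertools import combinations

def partial_vertex_cover(edges, k, t):
    """Is there a set of <= k vertices touching at least t edges?"""
    vertices = {v for e in edges for v in e}
    # Adding vertices never reduces coverage, so it suffices to test
    # subsets of the maximum allowed size.
    size = min(k, len(vertices))
    for subset in combinations(vertices, size):
        chosen = set(subset)
        covered = sum(1 for e in edges if chosen & set(e))
        if covered >= t:
            return True
    return False
```

On the path 0–1–2–3, a single vertex can cover at most two of the three edges, so (k = 1, t = 2) is a yes-instance while (k = 1, t = 3) is not.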