Results 1–10 of 34
Bidimensionality and Kernels
, 2010
Abstract

Cited by 58 (23 self)
Bidimensionality theory appears to be a powerful framework in the development of meta-algorithmic techniques. It was introduced by Demaine et al. [J. ACM 2005] as a tool to obtain subexponential-time parameterized algorithms for bidimensional problems on H-minor-free graphs. Demaine and Hajiaghayi [SODA 2005] extended the theory to obtain polynomial-time approximation schemes (PTASs) for bidimensional problems. In this paper, we establish a third meta-algorithmic direction for bidimensionality theory by relating it to the existence of linear kernels for parameterized problems. In parameterized complexity, each problem instance comes with a parameter k, and a parameterized problem is said to admit a linear kernel if there is a polynomial-time algorithm, called …
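For illustration only (not the construction from the paper above, which concerns linear kernels): the kernelization idea can be made concrete with the classical Buss reduction for Vertex Cover, a hedged sketch that shrinks an instance (G, k) to at most k² edges or correctly rejects it. All function and variable names here are ours.

```python
def buss_kernel(edges, k):
    """Classical Buss kernelization sketch for Vertex Cover.

    Returns (reduced_edges, k_remaining, forced_vertices), or None if
    (G, k) is provably a no-instance.  A linear kernel, as in the paper
    above, would instead bound the kernel size by O(k)."""
    edges = {frozenset(e) for e in edges}
    forced = set()  # vertices that must belong to every cover of size <= k
    changed = True
    while changed and k >= 0:
        changed = False
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:
                # v has more than k incident edges: if v were excluded,
                # all d > k of its neighbours would be needed in the cover.
                forced.add(v)
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    # After the reduction every vertex has degree <= k, so a yes-instance
    # can have at most k * k remaining edges.
    if k < 0 or len(edges) > k * k:
        return None
    return edges, k, forced
```

Running it on a star with centre 0 and five leaves with k = 1 forces vertex 0 into the cover and leaves an empty kernel.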
Efficient exact algorithms on planar graphs: Exploiting sphere cut branch decompositions
 In Proceedings of the 13th Annual European Symposium on Algorithms (ESA 2005)
, 2005
Abstract

Cited by 48 (18 self)
A divide-and-conquer strategy based on variations of the Lipton–Tarjan planar separator theorem has been one of the most common approaches to solving planar graph problems for more than 20 years. We present a new framework for designing fast subexponential exact and parameterized algorithms on planar graphs. Our approach is based on geometric properties of planar branch decompositions obtained by Seymour & Thomas, combined with refined dynamic-programming techniques on planar graphs that exploit properties of non-crossing partitions. Compared to divide-and-conquer algorithms, the main advantages of our method are (a) it is a generic method that allows us to attack broad classes of problems, and (b) the resulting algorithms admit a better worst-case analysis. To exemplify our approach, we show how to obtain an O(2^(6.903√n))-time algorithm solving weighted Hamiltonian Cycle. We observe how our technique can be used to solve Planar Graph TSP in time O(2^(9.8594√n)). Our approach can be used to design parameterized algorithms as well. For example, we introduce the first 2^(O(√k)) n^(O(1))-time algorithm for parameterized Planar k-Cycle by showing that for a given k we can decide whether a planar graph on n vertices has a cycle of length at least k in time O(2^(13.6√k) n + n^3).
Algorithmic Meta-Theorems
 In M. Grohe and R. Niedermeier, eds., International Workshop on Parameterized and Exact Computation (IWPEC), volume 5018 of LNCS
, 2008
Abstract

Cited by 22 (6 self)
Algorithmic meta-theorems are algorithmic results that apply to a whole range of problems, instead of addressing just one specific problem. Theorems of this kind are often stated relative to a certain class of graphs, so the general form of a meta-theorem reads “every problem in a certain class C of problems can be solved efficiently on every graph satisfying a certain property P”. A particularly well-known example of a meta-theorem is Courcelle’s theorem that every decision problem definable in monadic second-order logic (MSO) can be decided in linear time on any class of graphs of bounded treewidth [1]. The class C of problems can be defined in a number of different ways. One option is to state combinatorial or algorithmic criteria for membership in C. For instance, Demaine, Hajiaghayi and Kawarabayashi [5] showed that every minimisation problem that can be solved efficiently on graph classes of bounded treewidth, and for which approximate solutions can be computed efficiently from solutions of certain subinstances, has a PTAS on any class of graphs excluding a fixed minor. While this gives a strong unifying explanation for the PTASs of many …
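As a toy instance of the “linear time on bounded treewidth” phenomenon (our own illustrative sketch, not Courcelle’s construction): Maximum Independent Set is MSO-definable, and on trees — the treewidth-1 case — it admits a simple linear-time dynamic program of exactly the kind such meta-theorems generalize.

```python
def max_independent_set_tree(adj, root=0):
    """Linear-time DP for Maximum Independent Set on a tree.

    adj: adjacency list of an undirected tree (each edge listed in both
    directions).  Returns the size of a maximum independent set.
    Illustrates the bottom-up DP pattern behind bounded-treewidth
    algorithms."""
    n = len(adj)
    take = [1] * n   # best set size in v's subtree if v is included
    skip = [0] * n   # best set size in v's subtree if v is excluded
    parent = [-1] * n
    seen = [False] * n
    order, stack = [], [root]
    # iterative DFS to get a processing order (avoids recursion limits)
    while stack:
        v = stack.pop()
        seen[v] = True
        order.append(v)
        for u in adj[v]:
            if not seen[u]:
                parent[u] = v
                stack.append(u)
    # combine children into parents in reverse (post-order-like) order
    for v in reversed(order):
        p = parent[v]
        if p != -1:
            take[p] += skip[v]                  # p taken => v must be skipped
            skip[p] += max(take[v], skip[v])    # p skipped => v is free
    return max(take[root], skip[root])
```

On the path 0–1–2–3–4 the answer is 3 (take the endpoints and the middle vertex).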
Fast FAST
Abstract

Cited by 16 (7 self)
We present a randomized subexponential-time, polynomial-space parameterized algorithm for the k-Weighted Feedback Arc Set in Tournaments (k-FAST) problem. We also show that our algorithm can be derandomized by slightly increasing the running time. To derandomize our algorithm, we construct a new kind of universal hash function that we coin universal coloring families. For integers m, k and r, a family F of functions from [m] to [r] is called a universal (m, k, r)-coloring family if for any graph G on the vertex set [m] with at most k edges, there exists an f ∈ F which is a proper vertex coloring of G. Our algorithm is the first non-trivial subexponential-time parameterized algorithm outside the framework of bidimensionality.
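The defining property of a universal (m, k, r)-coloring family can be checked directly from the definition above. A hedged brute-force sketch (function names are ours; this is a sanity check, not the paper’s construction, and is exponential in m):

```python
from itertools import combinations

def is_proper_coloring(f, edges):
    """True iff f (vertex -> color, e.g. a tuple indexed by vertex)
    gives distinct colors to the endpoints of every edge."""
    return all(f[u] != f[v] for u, v in edges)

def is_universal_coloring_family(F, m, k, r):
    """Brute-force check of the (m, k, r)-coloring-family property:
    for EVERY graph on [m] with at most k edges, SOME f in F is a
    proper coloring of it.  Only feasible for tiny m and k."""
    all_edges = list(combinations(range(m), 2))
    for num_edges in range(k + 1):
        for G in combinations(all_edges, num_edges):
            if not any(is_proper_coloring(f, G) for f in F):
                return False
    return True
```

For example, the family of all 2^3 functions from [3] to [2] is a universal (3, 1, 2)-coloring family, while a single constant function is not, since it miscolors any graph with one edge.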
Bidimensionality and EPTAS
Abstract

Cited by 12 (5 self)
Bidimensionality theory appears to be a powerful framework for the development of meta-algorithmic techniques. It was introduced by Demaine et al. [J. ACM 2005] as a tool to obtain subexponential-time parameterized algorithms for problems on H-minor-free graphs. Demaine and Hajiaghayi [SODA 2005] extended the theory to obtain polynomial-time approximation schemes (PTASs) for bidimensional problems, and subsequently improved these results to EPTASs. Fomin et al. [SODA 2010] established a third meta-algorithmic direction for bidimensionality theory by relating it to the existence of linear kernels for parameterized problems. In this paper we revisit bidimensionality theory from the perspective of approximation algorithms and redesign the framework for obtaining EPTASs to be more powerful, easier to apply and easier to understand. One of the important conditions required in the framework developed by Demaine and Hajiaghayi [SODA 2005] is that to obtain an EPTAS for a graph optimization problem Π, we have to know a constant-factor approximation algorithm for Π. Our approach eliminates this strong requirement, which makes it amenable to more problems. At the heart of our framework is a decomposition lemma which states that for “most” bidimensional problems, there is a polynomial-time algorithm which, given an H-minor-free graph G as input and an ε > 0, outputs a vertex set X of size ε · OPT such that the treewidth of G \ X is O(1/ε). Here, OPT is the objective function value of the problem in question. This allows us to obtain EPTASs on (apex-)minor-free graphs for all problems covered by the previous framework, as well as for a wide range of packing problems, partial covering problems and problems that are neither closed under taking minors nor contractions. To the best of our knowledge, for many of these problems, including Cycle Packing, Vertex-H …
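A worked sketch of how such a decomposition lemma yields an EPTAS, under the (hedged) assumptions that the problem is a vertex-deletion-type minimization problem and is solvable in time $f(\mathrm{tw}) \cdot n^{O(1)}$ on graphs of treewidth $\mathrm{tw}$ — the arithmetic is ours, not quoted from the paper:

```latex
% Given X with |X| <= eps * OPT(G) and tw(G \ X) = O(1/eps):
% 1. solve G \ X optimally by treewidth DP in time f(O(1/eps)) * n^{O(1)};
% 2. add all of X to that solution, which stays feasible for G.
\begin{align*}
  |S| &\le \mathrm{OPT}(G \setminus X) + |X| \\
      &\le \mathrm{OPT}(G) + \varepsilon \cdot \mathrm{OPT}(G)
       = (1+\varepsilon)\,\mathrm{OPT}(G),
\end{align*}
% so the total running time f(O(1/eps)) * n^{O(1)} is that of an EPTAS:
% the dependence on 1/eps multiplies, rather than appears in the exponent
% of, the polynomial in n.
```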
Contraction Bidimensionality: The Accurate Picture
 Proceedings of the 17th Annual European Symposium on Algorithms, Lecture Notes in Computer Science
, 2009
Abstract

Cited by 9 (5 self)
We provide new combinatorial theorems on the structure of graphs that are contained as contractions in graphs of large treewidth. As a consequence of our combinatorial results, we unify and significantly simplify contraction bidimensionality theory, the meta-algorithmic framework for designing efficient parameterized and approximation algorithms for contraction-closed parameters.
Subexponential Algorithms for Partial Cover Problems
Abstract

Cited by 7 (2 self)
Partial Cover problems are optimization versions of fundamental and well-studied problems like Vertex Cover and Dominating Set. Here one is interested in covering (or dominating) the maximum number of edges (or vertices) using a given number k of vertices, rather than covering all edges (or vertices). In general graphs, these problems are hard for parameterized complexity classes when parameterized by k. It was recently shown by Amini et al. [FSTTCS 08] that Partial Vertex Cover and Partial Dominating Set are fixed-parameter tractable on large classes of sparse graphs, namely H-minor-free graphs, which include planar graphs and graphs of bounded genus. In particular, it was shown that on planar graphs both problems can be solved in time 2^(O(k)) n^(O(1)). During the last decade there has been an extensive study of parameterized subexponential algorithms. In particular, it was shown that the classical Vertex Cover and Dominating Set problems can be solved in subexponential time on H-minor-free graphs. The techniques developed to obtain subexponential algorithms for classical problems do not apply to partial cover problems. It was left as an open problem by Amini et al. [FSTTCS 08] whether there is a subexponential algorithm for Partial Vertex Cover and Partial Dominating Set. In this paper, we answer the question affirmatively by solving both problems in time 2^(O(√k)) n^(O(1)), not only on planar graphs but also on much larger classes of graphs, namely apex-minor-free graphs. Compared to previously known algorithms for these problems, our algorithms are significantly faster and simpler.
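For concreteness, Partial Vertex Cover asks for k vertices that together cover as many edges as possible. A naive baseline sketch (ours, not the paper’s algorithm) makes the objective precise; it tries all C(n, k) vertex sets, in contrast to the 2^(O(√k)) n^(O(1)) running times discussed above:

```python
from itertools import combinations

def partial_vertex_cover(n, edges, k):
    """Brute-force Partial Vertex Cover on a graph with vertices 0..n-1.

    Returns (best_count, best_set): a set of k vertices covering the
    maximum number of edges.  Exponential-time baseline for comparison
    with the subexponential algorithms in the abstract above."""
    best_count, best_set = -1, None
    for S in combinations(range(n), k):
        chosen = set(S)
        covered = sum(1 for u, v in edges if u in chosen or v in chosen)
        if covered > best_count:
            best_count, best_set = covered, chosen
    return best_count, best_set
```

On the path 0–1–2–3 with k = 1, the best single vertex (1 or 2) covers 2 of the 3 edges, while on a star the centre alone covers everything.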
Beyond Bidimensionality: Parameterized Subexponential Algorithms on Directed Graphs
Abstract

Cited by 7 (6 self)
In 2000, Alber et al. [SWAT 2000] obtained the first parameterized subexponential algorithm on undirected planar graphs by showing that k-Dominating Set is solvable in time 2^(O(√k)) …