Results 1–10 of 20
FPT is P-time extremal structure I
Algorithms and Complexity in Durham 2005, Proceedings of the First ACiD Workshop, volume 4 of Texts in Algorithmics, 2005
Cited by 28 (2 self)
We describe a broad program of research in parameterized complexity, and how this plays out for the MAX LEAF SPANNING TREE problem.
Parameterized Complexity of Vertex Cover Variants
2006
Cited by 23 (5 self)
Important variants of the Vertex Cover problem (among others, Connected Vertex Cover, Capacitated Vertex Cover, and Maximum …
Exact algorithms and applications for Tree-like Weighted Set Cover
Journal of Discrete Algorithms, 2006
Cited by 10 (5 self)
We introduce an NP-complete special case of the Weighted Set Cover problem and show its fixed-parameter tractability with respect to the maximum subset size, a parameter that appears to be small in relevant applications. More precisely, in this practically relevant variant we require that the given collection C of subsets of some base set S should be “tree-like.” That is, the subsets in C can be organized in a tree T such that every subset corresponds one-to-one to a tree node and, for each element s of S, the nodes corresponding to the subsets containing s induce a subtree of T. This is equivalent to the problem of finding a minimum edge cover in an edge-weighted acyclic hypergraph. Our main result is an algorithm running in O(3^k · m · n) time, where k denotes the maximum subset size, n := |S|, and m := |C|. The algorithm also implies a fixed-parameter tractability result for the NP-complete Multicut in Trees problem, complementing previous approximation results. Our results find applications in computational biology in phylogenomics and for saving memory in tree-decomposition-based graph algorithms.
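The “tree-like” condition in this abstract says that, for every element s, the subsets containing s must occupy a connected subtree of T. A minimal sketch of that check (the function name and the edge-list representation are our own illustration, not from the paper):

```python
# Illustrative sketch: verifying the "tree-like" property of a subset
# collection. C is tree-like w.r.t. a tree T over the subsets if, for every
# element s, the nodes whose subsets contain s induce a connected subtree.
from collections import defaultdict

def is_treelike(tree_edges, subsets):
    """tree_edges: list of (i, j) index pairs forming a tree over the
    subsets; subsets: list of sets. True iff every element's occurrence
    nodes induce a connected subgraph of the tree."""
    adj = defaultdict(list)
    for i, j in tree_edges:
        adj[i].append(j)
        adj[j].append(i)
    for s in set().union(*subsets):
        nodes = {i for i, sub in enumerate(subsets) if s in sub}
        # DFS restricted to `nodes`, starting from any occurrence of s
        start = next(iter(nodes))
        seen, stack = {start}, [start]
        while stack:
            u = stack.pop()
            for v in adj[u]:
                if v in nodes and v not in seen:
                    seen.add(v)
                    stack.append(v)
        if seen != nodes:
            return False  # occurrences of s are disconnected in T
    return True

# On the path 0-1-2: in `bad`, element 'a' appears at nodes 0 and 2 but not
# at node 1, so its occurrences are disconnected and the test fails.
path = [(0, 1), (1, 2)]
good = [{'a', 'b'}, {'a', 'b'}, {'a', 'c'}]
bad = [{'a', 'b'}, {'b'}, {'a', 'c'}]
```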
The union of minimal hitting sets: Parameterized combinatorial bounds and counting
24th Symposium on Theoretical Aspects of Computer Science (STACS 2007), LNCS 4393
Cited by 8 (2 self)
A k-hitting set in a hypergraph is a set of at most k vertices that intersects all hyperedges. We study the union of all inclusion-minimal k-hitting sets in hypergraphs of rank r (where the rank is the maximum size of hyperedges). We show that this union is relevant for certain combinatorial inference problems and give worst-case bounds on its size, depending on r and k. For r = 2 our result is tight, and for each r ≥ 3 we have an asymptotically optimal bound and make progress regarding the constant factor. The exact worst-case size for r ≥ 3 remains an open problem. We also propose an algorithm for counting all k-hitting sets in hypergraphs of rank r. Its asymptotic runtime matches the best one known for the much more special problem of finding one k-hitting set. The results are used for efficient counting of k-hitting sets that contain any particular vertex.
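As a toy illustration of the object this abstract studies, here is a brute-force sketch of the union of all inclusion-minimal k-hitting sets (exponential in the number of vertices, unlike the paper's parameterized bounds; all names are our own):

```python
# Illustrative brute force: enumerate candidate hitting sets by increasing
# size, keep only the inclusion-minimal ones, and return their union.
from itertools import combinations

def union_of_minimal_hitting_sets(vertices, edges, k):
    def hits(candidate):
        return all(candidate & e for e in edges)
    minimal = []
    for size in range(k + 1):
        for combo in combinations(sorted(vertices), size):
            s = set(combo)
            # Iterating sizes in increasing order means any hitting set
            # containing an already-found one is not inclusion-minimal.
            if hits(s) and not any(m <= s for m in minimal):
                minimal.append(s)
    return set().union(*minimal) if minimal else set()

# Rank-2 example (a graph, i.e. hyperedges of size 2): a path 1-2-3.
# Minimal hitting sets of size <= 2 are {2} and {1, 3}; union = {1, 2, 3}.
edges = [{1, 2}, {2, 3}]
```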
Improved Upper Bounds for Partial Vertex Cover
2008
Cited by 8 (0 self)
The Partial Vertex Cover problem is to decide whether a graph contains at most k nodes covering at least t edges. We present deterministic and randomized algorithms with run times of O*(1.396^t) and O*(1.2993^t), respectively. For graphs of maximum degree three, we show how to solve this problem in O*(1.26^t) steps. Finally, we give an O*(3^t) algorithm for Exact Partial Vertex Cover, which asks for at most k nodes covering exactly t edges.
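For concreteness, the decision problem stated in this abstract can be checked by brute force on small instances; this sketch only defines the problem, whereas the paper's O*(c^t) algorithms avoid the exponential-in-n enumeration:

```python
# Illustrative brute force for Partial Vertex Cover: do at most k nodes
# cover at least t edges? (Names and representation are our own.)
from itertools import combinations

def partial_vertex_cover(nodes, edges, k, t):
    for size in range(k + 1):
        for combo in combinations(nodes, size):
            chosen = set(combo)
            covered = sum(1 for u, v in edges if u in chosen or v in chosen)
            if covered >= t:
                return True
    return False

# A star K_{1,3}: its centre (node 0) alone covers all three edges, so
# k = 1, t = 3 is a yes-instance, while t = 4 exceeds the edge count.
star = [(0, 1), (0, 2), (0, 3)]
```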
A selection of useful theoretical tools for the design and analysis of optimization heuristics
Memetic Computing, 2009
Cited by 7 (3 self)
Intensive practical experimentation is certainly required for heuristics design and evaluation; however, a theoretical approach is also important in this area of research. This paper gives a brief description of a selection of theoretical tools that can be used for designing and analyzing various heuristics. For design and evaluation, we consider several examples of preprocessing procedures and probabilistic instance analysis methods. We also discuss some attempts at the theoretical explanation of the successes and failures of certain heuristics.
Competitive Group Testing and Learning Hidden Vertex Covers with Minimum Adaptivity
Cited by 3 (0 self)
Suppose that we are given a set of n elements, d of which are “defective.” A group test can check, for any subset (called a pool), whether it contains a defective. It is well known that d defectives can be found by using O(d log n) pools. This nearly optimal number of pools can be achieved in 2 stages, where tests within a stage are done in parallel. But then d must be known in advance. Here we explore group testing strategies that use a nearly optimal number of pools and a few stages although d is not known to the searcher. One easily sees that O(log d) stages are sufficient for a strategy with O(d log n) pools. Here we prove a lower bound of Ω(log d / log log d) stages and a more general pools vs. stages trade-off. As opposed to this, we devise a randomized strategy that finds d defectives using O(d log(n/d)) pools in 3 stages, with any desired probability 1 − ε. Open questions concern the optimal constant factors and practical implications. A related problem, motivated by, e.g., biological network analysis, is to learn hidden vertex covers of a small size …
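The classical O(d log n) pool count quoted in this abstract is achieved, in the fully adaptive setting, by repeated halving: test a pool, and only split it further if it is contaminated. A minimal sketch of that many-stage baseline (not the paper's few-stage strategies; names are our own):

```python
# Illustrative adaptive group testing by binary splitting. Each clean pool
# is cleared with a single test; contaminated pools are halved until the
# defectives are isolated, giving O(d log n) tests overall.

def group_test(pool, defectives):
    """One pool: does `pool` contain at least one defective element?"""
    return any(x in defectives for x in pool)

def find_defectives(items, defectives):
    found = []
    stack = [list(items)]
    while stack:
        pool = stack.pop()
        if not group_test(pool, defectives):
            continue  # one test clears the whole pool
        if len(pool) == 1:
            found.append(pool[0])  # defective isolated
            continue
        mid = len(pool) // 2
        stack.append(pool[:mid])
        stack.append(pool[mid:])
    return sorted(found)
```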
Vertex Cover Problem Parameterized Above and Below Tight Bounds
Cited by 2 (2 self)
We study the well-known Vertex Cover problem parameterized above and below tight bounds. We show that two of the parameterizations (both were suggested by Mahajan, Raman and Sikdar, J. Computer and System Sciences, 75(2):137–153, 2009) are fixed-parameter tractable and two other parameterizations are W[1]-hard (one of them is, in fact, W[2]-hard).
FPT Algorithms in Analysis of Heuristics for Extracting Networks in Linear Programs
Proc. 4th International Workshop on Parameterized and Exact Computation (IWPEC 2009), Lect. Notes Comput. Sci., 2009
Cited by 1 (1 self)
It often happens that although a problem is FPT, practitioners prefer to use imprecise heuristic methods to solve it in real-world situations, simply because the heuristic methods are faster. In this paper we argue that in this situation an FPT algorithm for the given problem may still be of considerable practical use. In particular, the FPT algorithm can be used to evaluate the quality of approximation of heuristic approaches. To demonstrate this way of applying FPT algorithms, we consider the problem of extracting a maximum-size reflected network in a linear program. We evaluate a known heuristic, SGA, and its two variations, a new heuristic, and an exact algorithm. The new heuristic and algorithm use fixed-parameter tractable procedures. The new heuristic turned out to be of little practical interest, but the exact algorithm is of interest when the network is close in size to the linear program, especially if the exact algorithm is used in conjunction with SGA. The most important conclusion is that a variant of SGA that was disregarded before, due to it being slower than the other heuristics, turns out to be the best choice because in …
Learning to Assign Degrees of Belief in Relational Domains
Cited by 1 (1 self)
A recurrent question in the design of intelligent agents is how to assign degrees of belief, or subjective probabilities, to various events in a relational environment. In the standard knowledge representation approach, these probabilities are evaluated according to a knowledge base, such as a logical program or a Bayesian network. However, even for very restricted representation languages, the problem of evaluating probabilities from a knowledge base is computationally prohibitive. By contrast, this study embarks on the learning to reason (L2R) framework, which aims at eliciting degrees of belief in an inductive manner. The agent is viewed as an anytime reasoner that iteratively improves its performance in light of the knowledge induced from its mistakes. By coupling exponentiated gradient strategies in online learning with weighted model counting techniques in reasoning, the L2R framework is shown to provide efficient solutions to relational probabilistic reasoning problems that are provably intractable in the classical framework.
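The exponentiated gradient strategy this abstract mentions is a multiplicative-weights update that keeps the belief vector on the probability simplex. A minimal sketch under our own naming (the learning rate eta and the loss gradients are illustrative, not the paper's):

```python
# Illustrative exponentiated gradient (EG) update: multiply each weight by
# exp(-eta * gradient), then renormalise so the weights stay a probability
# distribution.
import math

def eg_update(weights, grads, eta=0.5):
    new = [w * math.exp(-eta * g) for w, g in zip(weights, grads)]
    z = sum(new)
    return [w / z for w in new]

# Start uniform; penalise only the first coordinate. Its weight shrinks
# while the others grow by the same factor.
w = eg_update([0.25, 0.25, 0.25, 0.25], [1.0, 0.0, 0.0, 0.0])
```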