Results 1–10 of 50
The NP-completeness column: an ongoing guide
 JOURNAL OF ALGORITHMS
, 1987
"... This is the nineteenth edition of a (usually) quarterly column that covers new developments in the theory of NPcompleteness. The presentation is modeled on that used by M. R. Garey and myself in our book "Computers and Intractability: A Guide to the Theory of NPCompleteness," W. H. Freem ..."
Abstract

Cited by 239 (0 self)
This is the nineteenth edition of a (usually) quarterly column that covers new developments in the theory of NP-completeness. The presentation is modeled on that used by M. R. Garey and myself in our book "Computers and Intractability: A Guide to the Theory of NP-Completeness," W. H. Freeman & Co., New York, 1979 (hereinafter referred to as "[G&J]"; previous columns will be referred to by their dates). A background equivalent to that provided by [G&J] is assumed, and, when appropriate, cross-references will be given to that book and the list of problems (NP-complete and harder) presented there. Readers who have results they would like mentioned (NP-hardness, PSPACE-hardness, polynomial-time solvability, etc.) or open problems they would like publicized, should ...
Inference of haplotypes from samples of diploid populations: complexity and algorithms
 Journal of Computational Biology
"... The next phase of human genomics will involve largescale screens of populations for significant DNA polymorphisms, notably single nucleotide polymorphisms (SNPs). Dense human SNP maps are currently under construction. However, the utility of those maps and screens will be limited by the fact that ..."
Abstract

Cited by 81 (3 self)
The next phase of human genomics will involve large-scale screens of populations for significant DNA polymorphisms, notably single nucleotide polymorphisms (SNPs). Dense human SNP maps are currently under construction. However, the utility of those maps and screens will be limited by the fact that humans are diploid and it is presently difficult to get separate data on the two “copies.” Hence, genotype (blended) SNP data will be collected, and the desired haplotype (partitioned) data must then be (partially) inferred. A particular nondeterministic inference algorithm was proposed and studied by Clark (1990) and extensively used by Clark et al. (1998). In this paper, we more closely examine that inference method and the question of whether we can obtain an efficient, deterministic variant to optimize the obtained inferences. We show that the problem is NP-hard and, in fact, MaxSNP-complete; that the reduction creates problem instances conforming to a severe restriction believed to hold in real data (Clark, 1990); and that even if we first use a natural exponential-time operation, the remaining optimization problem is NP-hard. However, we also develop, implement, and test an approach based on that operation and (integer) linear programming. The approach works quickly and correctly on simulated data.
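Clark's (1990) inference rule referenced in this abstract can be made concrete with a small sketch. The encoding is standard (0/1 for homozygous sites, 2 for heterozygous), but the function names and the greedy resolution order below are illustrative assumptions, not the paper's deterministic variant:

```python
# Sketch of Clark-style haplotype inference. A genotype is a string over
# {0,1,2}; a haplotype explains a genotype if it matches every homozygous
# site, and its partner is forced at every heterozygous site.

def partner_of(genotype, haplotype):
    """If `haplotype` can explain `genotype`, return the complementary
    haplotype; otherwise return None."""
    other = []
    for g, h in zip(genotype, haplotype):
        if g == '2':                    # heterozygous: partner takes the other allele
            other.append('1' if h == '0' else '0')
        elif g == h:                    # homozygous: both copies must agree
            other.append(h)
        else:
            return None
    return ''.join(other)

def clark_inference(genotypes):
    known = set()
    # Step 1: genotypes with at most one heterozygous site are unambiguous.
    for g in genotypes:
        if g.count('2') <= 1:
            h = g.replace('2', '0')
            known.add(h)
            known.add(partner_of(g, h))
    # Step 2: greedily resolve ambiguous genotypes using known haplotypes.
    unresolved = [g for g in genotypes if g.count('2') > 1]
    progress = True
    while progress and unresolved:
        progress = False
        for g in list(unresolved):
            for h in list(known):
                p = partner_of(g, h)
                if p is not None:
                    known.add(p)
                    unresolved.remove(g)
                    progress = True
                    break
    return known, unresolved
```

The order in which known haplotypes are tried changes the outcome, which is exactly the nondeterminism the paper shows is NP-hard to optimize.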
Highly Secure and Efficient Routing
 IN PROC. IEEE INFOCOM 2004, HONG KONG
, 2004
"... In this paper, we consider the problem of routing in an adversarial environment, where a sophisticated adversary has penetrated arbitrary parts of the routing infrastructure and attempts to disrupt routing. We present protocols that are able to route packets as long as at least one nonfaulty path e ..."
Abstract

Cited by 56 (3 self)
In this paper, we consider the problem of routing in an adversarial environment, where a sophisticated adversary has penetrated arbitrary parts of the routing infrastructure and attempts to disrupt routing. We present protocols that are able to route packets as long as at least one non-faulty path exists between the source and the destination. These protocols have low communication overhead, low processing requirements, low incremental cost, and fast fault detection. We also present extensions to the protocols that penalize adversarial routers by blocking their traffic.
Refining Data Flow Information using Infeasible Paths
, 1997
"... . Experimental evidence indicates that large programs exhibit significant amount of branch correlation amenable to compiletime detection. Branch correlation gives rise to infeasible paths, which in turn make data flow information overly conservative. For example, defuse pairs that always span infe ..."
Abstract

Cited by 45 (6 self)
Experimental evidence indicates that large programs exhibit a significant amount of branch correlation amenable to compile-time detection. Branch correlation gives rise to infeasible paths, which in turn make data flow information overly conservative. For example, def-use pairs that always span infeasible paths cannot be tested by any program input, preventing 100% def-use testing coverage. We present an algorithm for identifying infeasible program paths and a data flow analysis technique that improves the precision of traditional def-use pair analysis by incorporating the information about infeasible paths into the analysis. Infeasible paths are computed using branch correlation analysis, which can be performed either intra- or interprocedurally. The efficiency of our technique is achieved through a demand-driven formulation of both the infeasible path detection and the def-use pair analysis. Our experiments indicate that even when a simple form of intraprocedural branch correlation i...
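The phenomenon the abstract describes can be illustrated with a toy example: a def-use pair whose only covering paths are made infeasible by branch correlation. The CFG, node names, and feasibility check below are invented for illustration and are not the paper's demand-driven algorithm:

```python
# Toy CFG: `x` is defined only when condition c is true, and used only when
# c is false; since c is unchanged between the two tests, the sole path
# covering the def-use pair (defX, useX) is infeasible.

# Edges: (src, dst, outcome), where outcome is (predicate, value) on branch
# edges and None on unconditional edges.
EDGES = [
    ('entry', 'defX', ('c', True)),     # if c: x = ...
    ('entry', 'skip', ('c', False)),
    ('defX',  'join', None),
    ('skip',  'join', None),
    ('join',  'other', ('c', True)),    # second test of the same, unchanged c
    ('join',  'useX', ('c', False)),    # else: use x
]

def paths(node, goal, acc):
    """Enumerate all acyclic paths from `node` to `goal` in the toy CFG."""
    if node == goal:
        yield acc
        return
    for s, d, cond in EDGES:
        if s == node:
            yield from paths(d, goal, acc + [(d, cond)])

def feasible(path):
    """Infeasible if an unchanged predicate must be both True and False."""
    seen = {}
    for _, cond in path:
        if cond is not None:
            var, val = cond
            if seen.setdefault(var, val) != val:
                return False
    return True

all_paths = list(paths('entry', 'useX', []))
covering = [p for p in all_paths if any(n == 'defX' for n, _ in p)]
# One path covers the def-use pair, and it is infeasible: the pair is
# untestable by any input, so a correlation-aware analysis can discard it.
print(len(covering), sum(feasible(p) for p in covering))
```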
All-du-path Coverage for Parallel Programs
 ACM SIGSOFT International Symposium on Software Testing and Analysis
, 1998
"... One significant challenge in bringing the power of parallel machines to application programmers is providing them with a suite of software tools similar to the tools that sequential programmers currently utilize. In particular, automatic or semiautomatic testing tools for parallel programs are lack ..."
Abstract

Cited by 31 (2 self)
One significant challenge in bringing the power of parallel machines to application programmers is providing them with a suite of software tools similar to the tools that sequential programmers currently utilize. In particular, automatic or semi-automatic testing tools for parallel programs are lacking. This paper describes our work in automatic generation of all-du-paths for testing parallel programs. Our goal is to demonstrate that, with some extension, sequential test data adequacy criteria are still applicable to parallel program testing. The concepts and algorithms in this paper have been incorporated as the foundation of our DELaware PArallel Software Testing Aid, della pasta.
Keywords: parallel programming, testing tool, all-du-path coverage
1 Introduction
Recent trends in computer architecture and computer networks suggest that parallelism will pervade workstations, personal computers, and network clusters, causing parallelism to become available to more than just the users ...
Polynomially Bounded Minimization Problems That Are Hard To Approximate
, 1994
"... Min PB is the class of minimization problems whose objective functions are bounded by a polynomial in the size of the input. We show that there exist several problems that are Min PBcomplete with respect to an approximation preserving reduction. These problems are very hard to approximate; in polyn ..."
Abstract

Cited by 26 (5 self)
Min PB is the class of minimization problems whose objective functions are bounded by a polynomial in the size of the input. We show that there exist several problems that are Min PB-complete with respect to an approximation preserving reduction. These problems are very hard to approximate; in polynomial time they cannot be approximated within n^ε for some ε > 0, where n is the size of the input, provided that P ≠ NP. In particular, the problem of finding the minimum independent dominating set in a graph, the problem of satisfying a 3-SAT formula setting the least number of variables to one, and the minimum bounded 0–1 programming problem are shown to be Min PB-complete. We also present a new type of approximation preserving reduction that is designed for problems whose approximability is expressed as a function of some size parameter. Using this reduction we obtain good lower bounds on the approximability of the treated problems.
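To pin down the first of these problems, here is a brute-force (exponential-time) reference for minimum independent dominating set. The inapproximability result concerns large n; this sketch only makes the definition concrete, and the graph encoding and names are assumptions:

```python
# Minimum independent dominating set by exhaustive search: smallest vertex
# set S that is independent (no edge inside S) and dominating (every vertex
# is in S or adjacent to S).

from itertools import combinations

def min_independent_dominating_set(vertices, edges):
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    for size in range(1, len(vertices) + 1):
        for cand in combinations(vertices, size):
            s = set(cand)
            independent = all(adj[u].isdisjoint(s - {u}) for u in s)
            dominating = all(v in s or adj[v] & s for v in vertices)
            if independent and dominating:
                return s          # first hit at this size is minimum
    return set()
```

On the path a–b–c–d, for instance, {b, c} dominates but is not independent, while {a, c} is a minimum independent dominating set.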
Bandwidth Reservation in Multihop Wireless Networks: Complexity, Mechanisms and Heuristics
 International Journal of Wireless and Mobile Computing
, 2004
"... We prove that link interferences in multihop wireless networks make the problem of selecting a path satisfying bandwidth requirements an NPcomplete problem, even under simplified rules for bandwidth reservation. This is in sharp contrast to path selection in wireline networks where efficient polyno ..."
Abstract

Cited by 16 (0 self)
We prove that link interferences in multihop wireless networks make the problem of selecting a path satisfying bandwidth requirements an NP-complete problem, even under simplified rules for bandwidth reservation. This is in sharp contrast to path selection in wireline networks, where efficient polynomial algorithms exist.
Automatic mutation test case generation via dynamic symbolic execution.
 In Proceedings of the 21st International Symposium on Software Reliability Engineering (ISSRE’10),
, 2010
"... AbstractThe automatic test case generation is the principal issue of the software testing activity. Dynamic symbolic execution appears to be a promising approach to this matter as it has been shown to be quite powerful in producing the sought tests. Despite its power, it has only been effectively ..."
Abstract

Cited by 13 (2 self)
Automatic test case generation is the principal issue of the software testing activity. Dynamic symbolic execution appears to be a promising approach to this matter, as it has been shown to be quite powerful in producing the sought tests. Despite its power, it has only been effectively applied to the entry-level criteria of the structural criteria hierarchy, such as branch testing. In this paper, an extension of this technique is proposed in order to effectively generate test data based on mutation testing. The proposed approach conjoins program transformation and dynamic symbolic execution techniques in order to successfully automate the test generation process. The propositions made in this paper have been incorporated into an automated framework for producing mutation-based test cases. Its evaluation on a set of benchmark programs suggests that it is able to produce tests capable of killing most of the non-equivalent introduced mutants. The same study also provides some evidence that, by employing efficient heuristics, it may be possible to perform mutation testing with reasonable resources.
The class of problems that are linearly equivalent to Satisfiability or a uniform method for proving NP-completeness
, 1995
"... We widely extend the class of problems that are linearly equivalent to Satisfiability. We show that many natural combinatorial problems are lineartime equivalent to Satisfiability (SATequivalent). We prove that the following problems are SATequivalent: 3Colorability, Path with Forbidden ..."
Abstract

Cited by 13 (0 self)
We widely extend the class of problems that are linearly equivalent to Satisfiability. We show that many natural combinatorial problems are linear-time equivalent to Satisfiability (SAT-equivalent). We prove that the following problems are SAT-equivalent: 3-Colorability, Path with Forbidden ...
Minimum Diameter Covering Problems
, 1997
"... A set V and a collection of (possibly nondisjoint) subsets are given. Also given is a real matrix describing distances between elements of V . A cover is a subset of V containing at least one representative from each subset. The multiplechoice minimum diameter problem is to select a cover of minim ..."
Abstract

Cited by 11 (1 self)
A set V and a collection of (possibly non-disjoint) subsets are given. Also given is a real matrix describing distances between elements of V. A cover is a subset of V containing at least one representative from each subset. The multiple-choice minimum diameter problem is to select a cover of minimum diameter. The diameter is defined as the maximum distance of any pair of elements in the cover. The multiple-choice dispersion problem, which is closely related, asks us to maximize the minimum distance between any pair of elements in the cover. The problems are NP-hard. We present polynomial time algorithms for approximating special cases and generalizations of these basic problems, and we prove in other cases that no such algorithms exist (assuming P ≠ NP).
1 Introduction
This paper deals with a class of multiple-choice problems. We assume that a set V of elements is given, together with subsets S_1, ..., S_m. A cover is a subset of V that contains at least one representative from ...
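Since the diameter of a cover can only grow as elements are added, an optimal cover may be assumed to pick exactly one representative per subset. A brute-force sketch under that observation, meant only to make the definitions concrete (function and parameter names are illustrative, and this is not one of the paper's approximation algorithms):

```python
# Multiple-choice minimum diameter: choose one representative from each
# subset so that the maximum pairwise distance among the chosen cover
# is minimized. Exponential in the number of subsets.

from itertools import product

def min_diameter_cover(subsets, dist):
    """subsets: list of lists of elements; dist: dict keyed by (u, v)."""
    best, best_cover = float('inf'), None
    for choice in product(*subsets):
        cover = set(choice)
        diam = max((dist[u, v] for u in cover for v in cover if u != v),
                   default=0)              # a singleton cover has diameter 0
        if diam < best:
            best, best_cover = diam, cover
    return best, best_cover
```

The dispersion variant mentioned in the abstract would instead maximize the minimum pairwise distance over the same choices.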