Results 1–10 of 51
On the Complexity Analysis of Static Analyses
Journal of the ACM, 1999
Cited by 75 (3 self)
Abstract:
… This paper argues that for many algorithms, and static analysis …
User-Definable Resource Bounds Analysis for Logic Programs
In ICLP’07, number 4670 in LNCS, 2007
Cited by 36 (17 self)
Abstract:
We present a static analysis that infers both upper and lower bounds on the usage that a logic program makes of a set of user-definable resources. The inferred bounds will in general be functions of input data sizes. A resource in our approach is a quite general, user-defined notion which associates a basic cost function with elementary operations. The analysis then derives the related (upper- and lower-bound) resource usage functions for all predicates in the program. We also present an assertion language which is used to define both such resources and resource-related properties that the system can then check based on the results of the analysis. We have performed some preliminary experiments with some concrete resources such as execution steps, bytes sent or received by an application, number of files left open, number of accesses to a database, number of calls to a procedure, number of asserts/retracts, etc. Applications of our analysis include resource consumption verification and debugging (including for mobile code), resource control in parallel/distributed computing, and resource-oriented specialization.
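The kind of bound the abstract describes can be illustrated by hand on a single predicate (a minimal sketch in Python, not the paper's analyzer; `append_prolog` and `steps_bound` are hypothetical names modeling a Prolog-style append/3, with "clause applications" as the chosen resource):

```python
# Illustration only: for append(Xs, Ys, Zs) with |Xs| = n, each derivation
# uses exactly n + 1 clause applications, so the inferred upper and lower
# resource-usage functions coincide: steps_lb(n) = steps_ub(n) = n + 1.

def append_prolog(xs, ys, counter):
    """Model of append/3; counter[0] counts clause applications (the resource)."""
    counter[0] += 1          # one step per clause application
    if not xs:               # base clause: append([], Ys, Ys)
        return ys
    # recursive clause: append([X|Xs], Ys, [X|Zs]) :- append(Xs, Ys, Zs)
    return [xs[0]] + append_prolog(xs[1:], ys, counter)

def steps_bound(n):
    """Resource-usage function of the input size n = |Xs|."""
    return n + 1

counter = [0]
result = append_prolog([1, 2, 3], [4, 5], counter)
assert result == [1, 2, 3, 4, 5]
assert counter[0] == steps_bound(3)   # 4 clause applications
```

Here the bound is exact; in general the paper's analysis reports possibly distinct upper- and lower-bound functions of the input data sizes.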
A New Method for Consequence Finding and Compilation in Restricted Languages
1999
Cited by 26 (4 self)
Abstract:
SFK (skip-filtered, kernel) resolution is a new method for finding "interesting" consequences of a first-order clausal theory Σ, namely those in some restricted target language L_T. In its more restrictive form, SFK resolution corresponds to a relatively efficient SAT method, directional resolution; in its more general form, to a full prime implicate algorithm, namely Tison's. It generalizes both of them by offering much more flexible search, first-order completeness, and a much wider range of inferential capabilities. SFK resolution has many applications: computing "characteristic" clauses for task-specific languages in abduction, explanation and nonmonotonic reasoning (Inoue 1992); obtaining LUB approximations of the input theory (Selman and Kautz 1996) which are of polynomial size; incremental and lazy exact knowledge compilation (del Val 1994); and compilation into a tractable form for restricted target languages, independently of the tractability of inference in the given …
Relating Semantic and Proof-Theoretic Concepts for Polynomial Time Decidability of Uniform Word Problems
In Proceedings 16th IEEE Symposium on Logic in Computer Science, LICS'2001, 2001
Cited by 26 (2 self)
Abstract:
In this paper we compare three approaches to polynomial-time decidability for uniform word problems for quasivarieties. Two of the approaches, by Evans and Burris, respectively, are semantic, referring to certain embeddability and axiomatizability properties. The third approach is more proof-theoretic in nature, inspired by McAllester's concept of local inference. We define two closely related notions of locality for equational Horn theories and show that both the criteria by Evans and Burris lie in between these two concepts. In particular, the variant we call stable locality will be shown to subsume both Evans' and Burris' methods.
Polynomial-time Computation via Local Inference Relations
ACM Trans. Comput. Logic, 2000
Cited by 23 (0 self)
Abstract:
We consider the concept of a local set of inference rules. A local rule set can be automatically transformed into a rule set for which bottom-up evaluation terminates in polynomial time. The local-rule-set transformation gives polynomial-time evaluation strategies for a large variety of rule sets that cannot be given terminating evaluation strategies by any other known automatic technique. This paper discusses three new results. First, it is shown that every polynomial-time predicate can be defined by an (unstratified) local rule set. Second, a new machine-recognizable subclass of the local rule sets is identified. Finally, we show that locality, as a property of rule sets, is undecidable in general.
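The bottom-up evaluation the abstract refers to can be sketched on a tiny Datalog-style rule set (a hedged illustration, not the paper's transformation; `bottom_up` and `path_rules` are names invented here). Transitive closure is the classic case where naive fixpoint iteration terminates in polynomial time, because only polynomially many `path` facts can ever be derived:

```python
def bottom_up(facts, rules):
    """Naive bottom-up evaluation: apply every rule until no new facts appear.

    Terminates here because the set of derivable facts is polynomial in the
    number of constants occurring in the input.
    """
    db = set(facts)
    changed = True
    while changed:
        changed = False
        for rule in rules:
            new_facts = rule(db) - db
            if new_facts:
                db |= new_facts
                changed = True
    return db

def path_rules(db):
    # path(X, Y) :- edge(X, Y).
    derived = {("path", x, y) for (p, x, y) in db if p == "edge"}
    # path(X, Z) :- edge(X, Y), path(Y, Z).
    derived |= {("path", x, z)
                for (p, x, y) in db if p == "edge"
                for (q, y2, z) in db if q == "path" and y2 == y}
    return derived

edges = {("edge", 1, 2), ("edge", 2, 3), ("edge", 3, 4)}
closure = bottom_up(edges, [path_rules])
assert ("path", 1, 4) in closure
assert len(closure) == 9   # 3 edge facts + 6 path facts
```

The point of the paper's locality condition is to characterize (and mechanically recognize a subclass of) rule sets for which this kind of saturation is guaranteed to stay polynomial.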
Normalizable Horn Clauses, Strongly Recognizable Relations and Spi
"... We exhibit a rich class of Horn clauses, which we call H1 , whose least models, though possibly infinite, can be computed effectively. We show that ..."
Abstract

Cited by 20 (2 self)
 Add to MetaCart
(Show Context)
We exhibit a rich class of Horn clauses, which we call H1, whose least models, though possibly infinite, can be computed effectively. We show that …
Automatic Complexity Analysis
2002
Cited by 17 (5 self)
Abstract:
We consider the problem of automating the derivation of tight asymptotic complexity bounds for solving Horn clauses. Clearly, the solving time crucially depends on the "sparseness" of the computed relations. Therefore, our asymptotic runtime analysis is accompanied by an asymptotic sparsity calculus together with an asymptotic sparsity analysis. The technical problem here is that least fixpoint iteration fails on asymptotic complexity expressions: the intuitive reason is that O(1) + O(1) = O(1), but O(1) + ⋯ + O(1) may return any value.
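The failure mode mentioned at the end can be made concrete with a toy example (our illustration, not the paper's calculus): each loop iteration costs O(1), and since O(1) + O(1) = O(1), a naive least-fixpoint iteration over asymptotic expressions would stabilize at O(1) per loop body; but the sum of O(1) over n iterations is O(n), so the number of terms in the sum cannot be ignored.

```python
def total_work(n):
    """Accumulate O(1) work per iteration; the total is O(n), not O(1)."""
    work = 0
    for _ in range(n):
        work += 1   # constant work per iteration
    return work

# The asymptotic class of the sum depends on how many terms are added:
assert total_work(10) == 10
assert total_work(1000) == 1000
```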
Automatic Decidability
In Proc. LICS 17, 2002
Cited by 16 (0 self)
Abstract:
We give a set of inference rules with constant constraints. Then we show how to extend a set of equational clauses, so that if the application of these inference rules halts on these clauses, then the theory is decidable by applying a standard set of paramodulation inference rules. In addition, we can determine the number of clauses generated in this decision procedure. For some theories, such as the theory of lists, there are O(n × lg(n)) clauses. For others it is polynomial. And for others it is simply exponential, such as the theory of (extensional) arrays.