Results 11 - 20 of 55
Deciding Quantifier-Free Presburger Formulas Using Finite Instantiation Based on Parameterized Solution Bounds
 In Proc. 19th LICS. IEEE
, 2003
Abstract

Cited by 35 (6 self)
Given a formula φ in quantifier-free Presburger arithmetic, it is well known that, if there is a satisfying solution to φ, there is one whose size, measured in bits, is polynomially bounded in the size of φ. In this paper, we consider a special class of quantifier-free Presburger formulas in which most linear constraints are separation (difference-bound) constraints, and the non-separation constraints are sparse. This class has been observed to commonly occur in software verification problems. We derive a new solution bound in terms of parameters characterizing the sparseness of linear constraints and the number of non-separation constraints, in addition to traditional measures of formula size. In particular, the number of bits needed per integer variable is linear in the number of non-separation constraints and logarithmic in the number and size of nonzero coefficients in them, but is otherwise independent of the total number of linear constraints in the formula. The derived bound can be used in a decision procedure based on instantiating integer variables over a finite domain and translating the input quantifier-free Presburger formula to an equisatisfiable Boolean formula, which is then checked using a Boolean satisfiability solver. We present empirical evidence indicating that this method can greatly outperform other decision procedures.
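The finite-instantiation idea the abstract describes can be sketched in a few lines: once a solution bound is known, each integer variable ranges over a finite domain and satisfiability reduces to a finite search (the paper translates to SAT rather than enumerating explicitly). The function name and the example constraints below are illustrative, not taken from the paper.

```python
from itertools import product

def finite_instantiation_sat(constraints, num_vars, bound):
    """Decide satisfiability by trying every assignment with each
    variable in [-bound, bound]; complete once `bound` covers the
    formula's derived solution bound."""
    for assignment in product(range(-bound, bound + 1), repeat=num_vars):
        if all(c(assignment) for c in constraints):
            return assignment
    return None

# x - y <= 2 (a separation constraint) together with x + 2y == 5
# (a non-separation constraint): satisfiable within a small bound.
model = finite_instantiation_sat(
    [lambda v: v[0] - v[1] <= 2, lambda v: v[0] + 2 * v[1] == 5],
    num_vars=2, bound=4)
print(model is not None)   # True

# x - y <= 0, y - z <= 0, z - x <= -1: a negative cycle, unsatisfiable.
print(finite_instantiation_sat(
    [lambda v: v[0] - v[1] <= 0, lambda v: v[1] - v[2] <= 0,
     lambda v: v[2] - v[0] <= -1],
    num_vars=3, bound=3))  # None
```

The paper's point is that for this constraint class the required `bound` grows far more slowly than the generic polynomial bound, so the resulting Boolean encoding stays small.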
Correctness of Pipelined Machines
 Formal Methods in Computer-Aided Design (FMCAD 2000), volume 1954 of LNCS
Abstract

Cited by 32 (13 self)
The correctness of pipelined machines is a subject that has been studied extensively. Most of the recent work has used variants of the Burch and Dill notion of correctness [4]. As new features are modeled, e.g., interrupts, new notions of correctness are developed. Given the plethora of correctness conditions, the question arises: what is a reasonable notion of correctness? We discuss the issue at length and show, by mechanical proof, that variants of the Burch and Dill notion of correctness are flawed. We propose a notion of correctness based on WEBs (Well-founded Equivalence Bisimulations) [16, 19]. Briefly, our notion of correctness implies that the ISA (Instruction Set Architecture) and MA (Micro-Architecture) machines have the same observable infinite paths, up to stuttering. This implies that the two machines satisfy the same CTL*\X properties and the same safety and liveness properties (up to stuttering). To test the utility of the idea, we use ACL2 to verify s...
VOC: A Translation Validator for Optimizing Compilers.
 Electronic Notes in Theoretical Computer Science
, 2002
On Solving Presburger and Linear Arithmetic with SAT
 In Proc. of Formal Methods in Computer-Aided Design (FMCAD 2002), LNCS
, 2002
Abstract

Cited by 27 (2 self)
We show a reduction to propositional logic from quantifier-free Presburger arithmetic, and disjunctive linear arithmetic, based on Fourier-Motzkin elimination. While the complexity of this procedure is not better than competing techniques, it has practical advantages in solving verification problems. It also promotes the option of deciding a combination of theories by reducing them to this logic.
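The core of the reduction is Fourier-Motzkin elimination, which removes one variable at a time by pairing its upper and lower bounds. A minimal sketch over the rationals (the function name and encoding are illustrative; the paper's contribution is compiling this elimination into propositional logic, not the elimination itself):

```python
from fractions import Fraction

def fm_eliminate(ineqs, j):
    """One Fourier-Motzkin step: eliminate x_j from a system of
    inequalities, each written as (coeffs, b) for sum(coeffs[i]*x[i]) <= b."""
    pos, neg, out = [], [], []
    for coeffs, b in ineqs:
        c = coeffs[j]
        if c == 0:
            out.append((coeffs, b))
            continue
        # Normalize so the x_j coefficient becomes +1 (pos) or -1 (neg).
        scaled = ([Fraction(a) / abs(c) for a in coeffs], Fraction(b) / abs(c))
        (pos if c > 0 else neg).append(scaled)
    # Summing an upper bound and a lower bound cancels x_j.
    for cu, bu in pos:
        for cl, bl in neg:
            out.append(([u + l for u, l in zip(cu, cl)], bu + bl))
    return out

# x - y <= 0 and y - x <= -1 imply 0 <= -1: unsatisfiable.
reduced = fm_eliminate([([1, -1], 0), ([-1, 1], -1)], 0)
contradiction = any(all(a == 0 for a in c) and b < 0 for c, b in reduced)
print(contradiction)  # True
```

Eliminating every variable leaves only constant inequalities, whose consistency decides the conjunction over the rationals; integer reasoning and disjunctions need the fuller machinery the paper develops.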
VOC: A methodology for the translation validation of optimizing compilers
 Journal of Universal Computer Science
, 2003
Abstract

Cited by 25 (3 self)
There is a growing awareness, both in industry and academia, of the crucial role of formally verifying the translation from high-level source code into low-level object code that is typically performed by an optimizing compiler. Formally verifying an optimizing compiler, as one would verify any other large program, is not feasible due to its size, ongoing evolution and modification, and, possibly, proprietary considerations. Translation validation is a novel approach that offers an alternative to the verification of translators in general and compilers in particular: rather than verifying the compiler itself, one constructs a validation tool which, after every run of the compiler, formally confirms that the target code produced in the run is a correct translation of the source program. The paper presents voc, a methodology for the translation validation of optimizing compilers. We distinguish between structure-preserving optimizations, for which we establish a simulation relation between the source and target code based on computational induction, and structure-modifying optimizations, for which we develop specialized “permutation rules”. The paper also describes voc64, a prototype translation validator tool that automatically produces verification conditions for the global optimizations of the SGI Pro64 compiler.
Checking Validity of Quantifier-Free Formulas in Combinations of First-Order Theories
, 2003
Translation Validation: From SIGNAL to C
 Proceedings of Conference on Correct System Design, E. R. Olderog and B. Steffen (Eds.), LNCS 1710, Springer-Verlag
, 1999
Abstract

Cited by 22 (1 self)
Translation validation is an alternative to the verification of translators (compilers, code generators). Rather than proving in advance that the compiler always produces a target code which correctly implements the source code (compiler verification), each individual translation (i.e. a run of the compiler) is followed by a validation phase which verifies that the target code produced on this run correctly implements the submitted source program. In order to be a practical alternative to compiler verification, a key feature of this validation is its full automation. Since the validation process attempts to "unravel" the transformation effected by the translators, its task becomes increasingly more difficult (and necessary) with the increase of sophistication and variety of the optimization methods employed by the translator. In this paper we address the practicability of translation validation for highly optimizing, industrial code generators from Signal, a widely used synchronous...
EVC: A Validity Checker for the Logic of Equality with Uninterpreted Functions and Memories, Exploiting Positive Equality, and Conservative Transformations
 In Computer-Aided Verification (CAV ’01)
, 2001
Abstract

Cited by 17 (12 self)
The property of Positive Equality [2] dramatically speeds up validity checking of formulas in the logic of Equality with Uninterpreted Functions and Memories (EUFM) [4]. The logic expresses correctness of high-level microprocessors. We present
An Experimental Evaluation of Ground Decision Procedures
, 2003
Abstract

Cited by 17 (2 self)
There is a large variety of algorithms for ground decision procedures, but their differences, in particular in terms of experimental performance, are not well studied. We develop maps of the behavior of ground decision procedures by comparing the performance of a variety of technologies on benchmark suites with differing characteristics. Based on these...
The Algebra of Equality Proofs
 In Jürgen Giesl, editor, 16th International Conference on Rewriting Techniques and Applications
, 2005
Abstract

Cited by 14 (5 self)
Proofs of equalities may be built from assumptions using proof rules for reflexivity, symmetry, and transitivity. Reflexivity is an axiom proving x=x for any x; symmetry is a 1-premise rule taking a proof of x=y and returning a proof of y=x; and transitivity is a 2-premise rule taking proofs of x=y and y=z, and returning a proof of x=z. Define an equivalence relation to hold between proofs iff they prove a theorem in common. The main theoretical result of the paper is that if all assumptions are independent, this equivalence relation is axiomatized by the standard axioms of group theory: reflexivity is the unit of the group, symmetry is the inverse, and transitivity is the multiplication. Using a standard completion of the group axioms, we obtain a rewrite system which puts equality proofs into canonical form. Proofs in this canonical form use the fewest possible assumptions, and a proof can be canonized in linear time using a simple strategy. This result is applied to obtain a simple extension of the union-find algorithm for ground equational reasoning which produces minimal proofs. The time complexity of the original union-find operations is preserved, and minimal proofs are produced in worst-case time O(n^(log₂ 3)), where n is the number of expressions being equated. As a second application, the approach is used to achieve significant performance improvements for the CVC cooperating decision procedure.
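The group-theoretic reading can be made concrete: a proof is a word over assumptions and their inverses, symmetry reverses and inverts the word, transitivity concatenates, and canonization is free-group cancellation. A toy sketch of this view (the names sym, trans, and canonize are illustrative, not from the paper):

```python
def sym(word):
    """Symmetry is the group inverse: reverse the word and flip each step."""
    return [(a, not inv) for a, inv in reversed(word)]

def trans(w1, w2):
    """Transitivity is the group multiplication: concatenate words."""
    return w1 + w2

def canonize(word):
    """Free-group reduction via a stack: cancel adjacent pairs a . a^-1.
    The canonical form uses the fewest possible assumptions."""
    out = []
    for step in word:
        if out and out[-1][0] == step[0] and out[-1][1] != step[1]:
            out.pop()            # a step followed by its inverse cancels
        else:
            out.append(step)
    return out

# Assumptions: h1 proves x = y, h2 proves y = z.
p_xz = trans([("h1", False)], [("h2", False)])     # a proof of x = z
roundabout = trans(p_xz, trans(sym(p_xz), p_xz))   # x=z, then z=x, then x=z
print(canonize(roundabout))  # [('h1', False), ('h2', False)]
```

The single-pass stack is the "simple strategy" flavor of linear-time canonization: each step is pushed or cancelled exactly once.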