Sufficient preconditions for modular assertion checking. (2008)

by Y Moy
Venue: In VMCAI
Results 1 - 10 of 15

Compositional Shape Analysis by means of Bi-Abduction

by Cristiano Calcagno, Dino Distefano, Peter O'Hearn, Hongseok Yang , 2009
Abstract - Cited by 143 (16 self)
This paper describes a compositional shape analysis, where each procedure is analyzed independently of its callers. The analysis uses an abstract domain based on a restricted fragment of separation logic, and assigns a collection of Hoare triples to each procedure; the triples provide an over-approximation of data structure usage. Compositionality brings its usual benefits – increased potential to scale, ability to deal with unknown calling contexts, graceful way to deal with imprecision – to shape analysis, for the first time. The analysis rests on a generalized form of abduction (inference of explanatory hypotheses) which we call bi-abduction. Bi-abduction displays abduction as a kind of inverse to the frame problem: it jointly infers anti-frames (missing portions of state) and frames (portions of state not touched by an operation), and is the basis of a new interprocedural analysis algorithm. We have implemented

Quantifier Elimination by Lazy Model Enumeration

by David Monniaux - Lecture Notes in Computer Science, 2010
Abstract - Cited by 22 (3 self)
Abstract We propose a quantifier elimination scheme based on nested lazy model enumeration through SMT-solving, and projections. This scheme may be applied to any logic that fulfills certain conditions; we illustrate it for linear real arithmetic. The quantifier elimination problem for linear real arithmetic is doubly exponential in the worst case, and so is our method. We have implemented it and benchmarked it against other methods from the literature.

Citation Context

...s in transforming a quantified formula into an equivalent quantifier-free formula. For instance, the formulas ∀y (y − z ≥ x ⇒ x + z ≥ 1) and x ≥ 1 − z are equivalent (they have the same models for (x, z)), whether considered over the reals or integers. Quantifier elimination subsumes both satisfiability testing for quantifier-free formulas, and the decision of quantified formulas without free variables. In program analysis, quantifier elimination has been applied to obtain optimal invariants and optimal abstract transformers [22, 21], and to obtain preconditions for modular assertion checking [23]. Unfortunately, quantifier elimination tends to be slow; as recalled in §4, worst-case complexities for useful theories tend to be towers of exponentials. Yet, high worst-case complexity does not preclude exploring procedures that perform fast on most examples, as shown by the high success of SAT solving. This motivates our work on new quantifier elimination algorithms. Many interesting mathematical theories admit quantifier elimination. In order to introduce better elimination schemes, we shall first describe a naive, but inefficient algorithm (§2.2) which works by calling a projection opera...
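The projection operator that the naive algorithm (§2.2) calls can be illustrated with Fourier-Motzkin elimination, the textbook projection for linear real arithmetic. The sketch below is our own illustration, not the paper's lazy-enumeration algorithm; the constraint encoding and names are ours:

```python
from itertools import product

def fourier_motzkin(constraints, var):
    """Project out `var` from a conjunction of linear inequalities.
    Each constraint is (coeffs, b), read as  sum(coeffs[v] * v) >= b."""
    lowers, uppers, rest = [], [], []
    for coeffs, b in constraints:
        c = coeffs.get(var, 0)
        if c > 0:
            lowers.append((coeffs, b, c))    # gives a lower bound on var
        elif c < 0:
            uppers.append((coeffs, b, c))    # gives an upper bound on var
        else:
            rest.append((coeffs, b))         # var does not occur
    # Pair every lower bound with every upper bound; var cancels out.
    for (lc, lb, lk), (uc, ub, uk) in product(lowers, uppers):
        merged = {v: lc.get(v, 0) / lk + uc.get(v, 0) / -uk
                  for v in (set(lc) | set(uc)) - {var}}
        rest.append((merged, lb / lk + ub / -uk))
    return rest

# y - z >= x gives only a lower bound on y, so eliminating y from
# "exists y. y - z >= x" leaves no constraint at all (i.e., "true"):
assert fourier_motzkin([({'y': 1, 'x': -1, 'z': -1}, 0)], 'y') == []
```

Since ∃y (y − z ≥ x) projects to the trivially true constraint set, the quantified formula ∀y (y − z ≥ x ⇒ x + z ≥ 1) collapses to x + z ≥ 1, i.e., x ≥ 1 − z, matching the quantifier-free equivalent quoted above.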

Inferring Min and Max Invariants Using Max-plus Polyhedra

by Xavier Allamigeon, Stéphane Gaubert
Abstract - Cited by 20 (9 self)
Abstract. We introduce a new numerical abstract domain able to infer min and max invariants over the program variables, based on max-plus polyhedra. Our abstraction is more precise than octagons, and allows to express non-convex properties without any disjunctive representations. We have defined sound abstract operators, evaluated their complexity, and implemented them in a static analyzer. It is able to automatically compute precise properties on numerical and memory manipulating programs such as algorithms on strings and arrays.
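To make the non-convexity claim concrete, here is a small sketch (our own illustration, not the authors' abstract domain): sampling the max-plus segment between two points yields a bent path, a set that is tropically convex yet not convex in the classical sense, which is how min/max relations can be expressed without disjunctions:

```python
def tropical_segment(p, q, depth=8):
    """Sample the max-plus segment between points p and q: all points
    max(lam + p, mu + q), taken componentwise, with max(lam, mu) = 0."""
    pts = set()
    for t in range(depth + 1):
        pts.add(tuple(max(a - t, b) for a, b in zip(p, q)))  # lam = -t, mu = 0
        pts.add(tuple(max(a, b - t) for a, b in zip(p, q)))  # lam = 0, mu = -t
    return pts

seg = tropical_segment((0, 0), (4, 2))
# The segment bends through (2, 0): it contains (2, 0) but not (2, 1),
# the classical midpoint of the straight line from (0, 0) to (4, 2).
```

Along this segment the second coordinate satisfies y = max(0, x − 2), a relation an octagon or any single convex polyhedron cannot capture exactly.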

Citation Context

... of our abstraction. For instance, it could be directly integrated in non-disjunctive array predicate abstractions (e.g. [33]), and help to automatically discover preconditions on C library functions [35] without disjunction. Moreover, when analyzing sorting algorithms, in order to prove that the resulting array is correctly ordered (and not infer information over its first and last elements only), on...

Automatic inference of necessary preconditions.

by Patrick Cousot, Radhia Cousot, Manuel Fähndrich, Francesco Logozzo - In VMCAI, 2013
Abstract - Cited by 10 (3 self)
Abstract not found

Citation Context

...e statically proven by cccheck [16] (or similar tools). We will use the assertions in A to infer necessary preconditions. This consists in propagating these conditions backwards to the origin of the traces of the semantics S, at the entry control point. The inference of termination preconditions is a separate problem [10], so we ignore the non-terminating behaviors I, or equivalently, assume termination, i.e., I = ∅.

3 Sufficient Preconditions

The weakest (liberal) preconditions provide sufficient preconditions which guarantee the (partial) correctness, i.e., the absence of errors in the program [2,5,9,22,29,31]:

  public static void Example(object[] a) {
    Contract.Requires(a != null);
    for (var i = 0; i <= a.Length; i++) {
      a[i] = ...f(a[i])... ; // (*)
      if (NonDet()) return;
    }
  }

Fig. 1. The weakest precondition for this code is false, which rules out good executions. Our technique only excludes bad runs, inferring the necessary precondition 0 < a.Length.

∀s ∈ Σ : wlp(P, true)(s) ⇔ (E(s) = ∅).

The main drawbacks preventing the use of the weakest (liberal) precondition calculus for automatic precondition inference are: (i) in the presence of loops, there is no algorithm that computes weakest (liberal) p...
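The gap between the two kinds of preconditions in Fig. 1 can be checked by brute force. The toy Python simulation below (our own sketch, not the paper's analyzer) enumerates every sequence of NonDet() choices for small array lengths:

```python
from itertools import product

def run(length, choices):
    """One execution of the Fig. 1 loop over an array with `length` cells;
    choices[i] is the value NonDet() returns at iteration i."""
    i = 0
    while i <= length:          # note <=, as in the figure
        if i == length:
            return 'error'      # a[i] is accessed out of bounds
        if choices[i]:
            return 'ok'         # if (NonDet()) return;
        i += 1

def outcomes(length):
    """All possible outcomes over every nondeterministic choice."""
    return {run(length, c) for c in product([False, True], repeat=length)}
```

outcomes(0) is {'error'}: every run fails, so 0 < a.Length is a necessary precondition. For every n ≥ 1, outcomes(n) contains both 'ok' and 'error' (take NonDet() false throughout to fail), so no nonempty precondition makes all runs safe and the weakest liberal precondition is false.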

Weakest precondition synthesis for compiler optimizations

by Nuno P. Lopes, José Monteiro - In Proc. of the 15th International Conference on Verification, Model Checking, and Abstract Interpretation, 2014
Abstract - Cited by 4 (3 self)
Abstract. Compiler optimizations play an increasingly important role in code generation. This is especially true with the advent of resource-limited mobile devices. We rely on compiler optimizations to improve performance, reduce code size, and reduce power consumption of our programs. Despite being a mature field, compiler optimizations are still designed and implemented by hand, and usually without providing any guarantee of correctness. In addition to devising the code transformations, designers and implementers have to come up with an analysis that determines in which cases the optimization can be safely applied. In other words, the optimization designer has to specify a precondition that ensures that the optimization is semantics-preserving. However, devising preconditions for optimizations by hand is a non-trivial task. It is easy to specify a precondition that, although correct, is too restrictive, and therefore misses some optimization opportunities. In this paper, we propose, to the best of our knowledge, the first algorithm for the automatic synthesis of preconditions for compiler optimizations. The synthesized preconditions are provably correct by construction, and they are guaranteed to be the weakest in the precondition language that we consider. We implemented the proposed technique in a tool named PSyCO. We present examples of preconditions synthesized by PSyCO, as well as the results of running PSyCO on a set of optimizations.
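The idea of "weakest in the precondition language that we consider" can be mimicked in a few lines. The rewrite, the three-predicate language, and the exhaustive check over small integers below are our own toy stand-ins, not PSyCO's solver-based synthesis:

```python
from itertools import product

# Toy instance: find the weakest precondition, within a fixed predicate
# language, under which a rewrite preserves semantics on sampled inputs.

def original(a, b):  return a // b * b   # source expression
def optimized(a, b): return a            # proposed "optimization"

LANGUAGE = {
    'false':      lambda a, b: False,
    'a % b == 0': lambda a, b: a % b == 0,
    'true':       lambda a, b: True,
}

SAMPLES = [(a, b) for a, b in product(range(-6, 7), repeat=2) if b != 0]

def valid(pred):
    """pred is a correct precondition if the rewrite agrees with the
    original on every sampled input satisfying pred."""
    return all(original(a, b) == optimized(a, b)
               for a, b in SAMPLES if pred(a, b))

def weakest():
    """Among the valid predicates, pick the one admitting most inputs."""
    good = [name for name, p in LANGUAGE.items() if valid(p)]
    return max(good, key=lambda n: sum(LANGUAGE[n](a, b) for a, b in SAMPLES))
```

On these samples weakest() returns 'a % b == 0': 'true' is too weak (it admits inputs where a // b * b ≠ a), while 'false' is correct but misses every optimization opportunity, exactly the trade-off the abstract describes.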

Citation Context

...r automatic generation. There are several competing approaches for WLP synthesis. These include, for example, precondition templates and constraint solving (e.g., [15]), quantifier elimination (e.g., [27]), abstract interpretation (e.g., [9]), and CEGAR, predicate abstraction, and interpolation for predicate generation (e.g., [33]). Some algorithms combine multiple techniques to achieve better perform...

Counterexample-guided precondition inference

by Mohamed Nassim Seghir, Daniel Kroening - In ESOP , 2013
Abstract - Cited by 2 (0 self)
Abstract. The precondition for an assertion within a procedure is useful for understanding, verifying and debugging programs. As the procedure might be used in multiple calling-contexts within the program, the precondition should be sufficiently precise to enable re-use. We present an extension of counterexample-guided abstraction refinement (CEGAR) for automated precondition inference. Starting with an overapproximation of both the set of safe and unsafe states, we iteratively refine them until they become disjoint. The resulting precondition is then necessary and sufficient for the validity of the assertion, which prevents false alarms. We have implemented our approach and present experimental results using string and array-manipulating programs.
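A minimal caricature of this refinement loop (our own sketch, with a one-variable procedure and integer half-lines standing in for the paper's abstraction): keep over-approximations of the safe and unsafe input sets and shrink them with concrete runs until they are disjoint:

```python
def is_safe(x):
    """Concrete run of a toy procedure:  y = x - 1; assert y >= 0."""
    return x - 1 >= 0

def infer_precondition(lo=-8, hi=8):
    """Shrink over-approximations of the safe and unsafe input sets until
    they are disjoint.  Assumes safety is monotone in x, so one concrete
    run refines a whole half-line (mimicking a counterexample step)."""
    safe_lo, unsafe_hi = lo, hi + 1   # safe: x >= safe_lo; unsafe: x < unsafe_hi
    while safe_lo < unsafe_hi:        # the two over-approximations overlap
        x = (safe_lo + unsafe_hi) // 2    # a state still claimed by both
        if is_safe(x):
            unsafe_hi = x             # witness: x cannot be unsafe
        else:
            safe_lo = x + 1           # counterexample: x cannot be safe
    return safe_lo                    # precondition: x >= safe_lo
```

infer_precondition() converges to 1: the inferred precondition x >= 1 is both necessary and sufficient for the toy assertion, so it raises no false alarms, which is the property the abstract claims.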

Citation Context

...esents a general scheme for extending the previously mentioned tools to infer preconditions, as it uses ingredients that are common to all of them. Moy has proposed a technique to infer preconditions [23] using a combination of state-of-the-art techniques such as abstract interpretation, weakest preconditions and quantifier elimination. While his technique is stronger than many existing ones, it is un...

Augmenting Counterexample-Guided Abstraction Refinement with Proof Templates

by Thomas E. Hart, Kelvin Ku, Arie Gurfinkel, Marsha Chechik, David Lie
Abstract - Cited by 1 (1 self)
Abstract—Existing software model checkers based on predicate abstraction and refinement typically perform poorly at verifying the absence of buffer overflows, with analyses depending on the sizes of the arrays checked. We observe that many of these analyses can be made efficient by providing proof templates for common array traversal idioms, which guide the model checker towards proofs that are independent of array size. We have integrated this technique into our software model checker, PTYASM, and have evaluated our approach on a set of test cases derived from the Verisec suite, demonstrating that our technique enables verification of the safety of array accesses independently of array size.
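The benefit of a proof template can be seen on the standard traversal idiom. The checker below is an illustrative sketch (ours, not PTYASM): it tests, over sampled values, that a candidate invariant is inductive and implies in-bounds access for every sampled array size at once:

```python
def inductive(inv, ns=range(0, 16), istates=range(-4, 20)):
    """Check by exhaustive testing on small values that `inv(i, n)` is an
    inductive invariant of  `for (i = 0; i < n; i++) use(a[i]);`  and that
    it implies every access a[i] is within bounds, for every sampled n."""
    for n in ns:
        if not inv(0, n):                 # initiation: i = 0 on entry
            return False
        for i in istates:
            if inv(i, n) and i < n:       # loop body executes from (i, n)
                if not 0 <= i < n:        # the access a[i] must be safe
                    return False
                if not inv(i + 1, n):     # consecution: invariant preserved
                    return False
    return True

# The traversal template is inductive for every sampled n simultaneously:
template = lambda i, n: 0 <= i <= n
```

The template 0 <= i <= n passes for every sampled n, hinting at the size-independent proof the templates aim to steer the model checker toward; a naive guess like i == 0 fails consecution on the first iteration.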

Citation Context

...on fixed abstractions which cannot be refined at analysis time, thus leading to false alarms. Tools based on deductive verification typically require an inference procedure to provide loop invariants [20]. Denney and Fischer [21] have applied a pattern-based approach which is similar in spirit to ours to deductive verification, but not for the problem of array bounds checking, and their scope is restr...

Verification modulo versions: Towards usable verification

by Francesco Logozzo, Shuvendu K. Lahiri, Manuel Fähndrich, Sam Blackshear - In PLDI, 2014
Abstract - Cited by 1 (0 self)
Abstract We introduce Verification Modulo Versions (VMV), a new static analysis technique for reducing the number of alarms reported by static verifiers while providing sound semantic guarantees. First, VMV extracts semantic environment conditions from a base program P. Environmental conditions can either be sufficient conditions (implying the safety of P) or necessary conditions (implied by the safety of P). Then, VMV instruments a new version of the program, P′, with the inferred conditions. We prove that we can use (i) sufficient conditions to identify abstract regressions of P′ w.r.t. P; and (ii) necessary conditions to prove the relative correctness of P′ w.r.t. P. We show that the extraction of environmental conditions can be performed at a hierarchy of abstraction levels (history, state, or call conditions) with each subsequent level requiring a less sophisticated matching of the syntactic changes between P and P′. Call conditions are particularly useful because they only require the syntactic matching of entry points and callee names across program versions. We have implemented VMV in a widely used static analysis and verification tool. We report our experience on two large code bases and demonstrate a substantial reduction in alarms while additionally providing relative correctness guarantees.

Citation Context

... sufficient) condition on the particular API, and then automatically insert in the next version of the program. Several authors addressed the problem of under-approximating sufficient conditions—even if they did not explicitly state it in the terms of this paper. For instance, previous work in specification mining [2] and interface inference [1, 20] under-approximates sufficient history conditions ~S: their goal is to describe safe API usage by inferring sequences of API calls that are sufficient to prevent crashes in library code. Similarly, works on the inference of sufficient preconditions [5, 32] under-approximate our method call sufficient conditions S. In the worst case, such approximations result in the sufficient condition false. Few authors addressed the inference of necessary conditions. For instance, Bouaziz et al. [6] used necessary history conditions to infer sequences of method calls that inevitably lead to a crash, and Cousot et al. [10] used them in the context of modular analysis. In the worst case, such approximations yield the necessary condition true.

10. Conclusion

In this paper, we addressed the problem of reducing the number of warnings, yet providing soundness gua...

Inferring Sufficient Conditions with Backward Polyhedral Under-Approximations

by unknown authors , 2012
Abstract
sufficient conditions with backward polyhedral under-approximations

Citation Context

...pre). He also mentions that classic domains, such as intervals, are inadequate to express underapproximations as they are not closed under complementation, but he does not propose an alternative. Moy [15] solves this issue by allowing disjunctions of abstract states (they correspond to path enumerations and can grow arbitrarily large). Lev-Ami et al. [11] derive under-approximations from overapproxima...


Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University