Results 1 - 10 of 21
Infinitary Control Flow Analysis: a Collecting Semantics for Closure Analysis
1997
"... Defining the collecting semantics is usually the first crucial step in adapting the general methodology of abstract interpretation to the semantic framework or programming language at hand. In this paper we show how to define a collecting semantics for control flow analysis; due to the generality of ..."
Abstract
-
Cited by 75 (9 self)
- Add to MetaCart
(Show Context)
Defining the collecting semantics is usually the first crucial step in adapting the general methodology of abstract interpretation to the semantic framework or programming language at hand. In this paper we show how to define a collecting semantics for control flow analysis; due to the generality of the formulation we need to appeal to coinduction (or greatest fixed points) in order to define the analysis. We then prove the semantic soundness of the collecting semantics and that all totally deterministic instantiations have a least solution; this incorporates k-CFA, polymorphic splitting and a new class of uniform-k-CFA analyses.
1 Introduction
Control flow analysis [16, 17] is known by many names: closure analysis [13, 15], set-based analysis [9] (touching upon other constraint-based analyses [1]), and flow analysis [6]. Although the fine formulational details differ they are all variations over a theme, producing analyses of different precision: 0-CFA [16], k-CFA [16, 10], poly-k-CF...
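The k = 0 instance of this family (0-CFA) can be pictured as a small monotone fixed-point computation over a cache that maps program-point labels and variables to sets of abstractions. The Haskell sketch below only illustrates that monovariant, least-fixed-point special case, not the paper's coinductive formulation; the names and labelling scheme are ours.

-- Illustrative monovariant 0-CFA: the cache maps labels and variables to
-- the set of lambda abstractions that may flow there.
import qualified Data.Map as Map
import qualified Data.Set as Set

type Label = Int
type Var   = String

data Expr
  = Ref Var Label            -- x^l
  | Lam Var Expr Label       -- (\x. e)^l
  | App Expr Expr Label      -- (e1 e2)^l
  deriving (Eq, Ord, Show)

type AbsVal = Expr           -- abstract values are the lambda terms themselves
type Cache  = Map.Map (Either Label Var) (Set.Set AbsVal)

labelOf :: Expr -> Label
labelOf (Ref _ l)   = l
labelOf (Lam _ _ l) = l
labelOf (App _ _ l) = l

subterms :: Expr -> [Expr]
subterms e@(Ref _ _)   = [e]
subterms e@(Lam _ b _) = e : subterms b
subterms e@(App f a _) = e : subterms f ++ subterms a

-- One monotone pass over every subexpression; 'analyse' iterates it to a
-- least fixed point, which exists because the cache can only grow and the
-- set of abstractions occurring in the program is finite.
step :: Expr -> Cache -> Cache
step top cache = foldr visit cache (subterms top)
  where
    get k c    = Map.findWithDefault Set.empty k c
    add k vs c = Map.insertWith Set.union k vs c
    visit e c = case e of
      Ref x l   -> add (Left l) (get (Right x) c) c
      Lam _ _ l -> add (Left l) (Set.singleton e) c
      App f a l -> Set.foldr (flowCall a l) c (get (Left (labelOf f)) c)
    flowCall arg callLbl fun c = case fun of
      Lam x body _ ->
        add (Right x) (get (Left (labelOf arg)) c)                 -- bind the formal
            (add (Left callLbl) (get (Left (labelOf body)) c) c)   -- result flows to the call site
      _ -> c

analyse :: Expr -> Cache
analyse e = go Map.empty
  where go c = let c' = step e c in if c' == c then c else go c'

For example, on ((\x. x^0)^1 (\y. y^4)^2)^3 the computed cache maps both the variable x and the call-site label 3 to the abstraction labelled 2.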
Type and effect systems
ACM Computing Surveys, 1999
"... Abstract. The design and implementation of a correct system can benefit from employing static techniques for ensuring that the dynamic behaviour satisfies the specification. Many programming languages incorporate types for ensuring that certain operations are only applied to data of the appropriate ..."
Abstract
-
Cited by 47 (0 self)
- Add to MetaCart
(Show Context)
The design and implementation of a correct system can benefit from employing static techniques for ensuring that the dynamic behaviour satisfies the specification. Many programming languages incorporate types for ensuring that certain operations are only applied to data of the appropriate form. A natural extension of type checking techniques is to enrich the types with annotations and effects that further describe intensional aspects of the dynamic behaviour.
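Not a rule from the survey itself, but a schematic example of the kind of annotation it describes: a function type carries a latent effect phi that is released when the function is applied, and effects accumulate by set union (in LaTeX notation):

\[
\frac{\Gamma \vdash e_1 : \tau_2 \xrightarrow{\varphi} \tau_1 \;\&\; \varphi_1
      \qquad
      \Gamma \vdash e_2 : \tau_2 \;\&\; \varphi_2}
     {\Gamma \vdash e_1\, e_2 : \tau_1 \;\&\; \varphi_1 \cup \varphi_2 \cup \varphi}
\]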
From Polyvariant Flow Information to Intersection and Union Types
J. Funct. Programming, 1998
"... Many polyvariant program analyses have been studied in the 1990s, including k-CFA, polymorphic splitting, and the cartesian product algorithm. The idea of polyvariance is to analyze functions more than once and thereby obtain better precision for each call site. In this paper we present an equivalen ..."
Abstract
-
Cited by 43 (7 self)
- Add to MetaCart
Many polyvariant program analyses have been studied in the 1990s, including k-CFA, polymorphic splitting, and the Cartesian product algorithm. The idea of polyvariance is to analyze functions more than once and thereby obtain better precision for each call site. In this paper we present an equivalence theorem which relates a co-inductively defined family of polyvariant flow analyses and a standard type system. The proof embodies a way of understanding polyvariant flow information in terms of union and intersection types, and, conversely, a way of understanding union and intersection types in terms of polyvariant flow information. We use the theorem as the basis for a new flow-type system in the spirit of the λCIL-calculus of Wells, Dimock, Muller, and Turbak, in which types are annotated with flow information. A flow-type system is useful as an interface between a flow-analysis algorithm and a program optimizer. Derived systematically via our equivalence theorem, our flow-type system should be a g...
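A schematic instance of the correspondence (ours, not the paper's formal statement): analysing the identity function once per call site, say at an integer site and a boolean site, yields two flow variants that read back as an intersection type, while merging the flows that reach a join point reads back as a union type:

\[
\lambda x.\, x \;:\; (\mathsf{int} \to \mathsf{int}) \wedge (\mathsf{bool} \to \mathsf{bool})
\]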
Abstracting Abstract Machines
2010
"... We describe a derivational approach to abstract interpretation that yields novel and transparently sound static analyses when applied to well-established abstract machines. To demonstrate the technique and support our claim, we transform the CEK machine of Felleisen and Friedman, a lazy variant of K ..."
Abstract
-
Cited by 34 (21 self)
- Add to MetaCart
(Show Context)
We describe a derivational approach to abstract interpretation that yields novel and transparently sound static analyses when applied to well-established abstract machines. To demonstrate the technique and support our claim, we transform the CEK machine of Felleisen and Friedman, a lazy variant of Krivine’s machine, and the stack-inspecting CM machine of Clements and Felleisen into abstract interpretations of themselves. The resulting analyses bound temporal ordering of program events; predict return-flow and stack-inspection behavior; and approximate the flow and evaluation of by-need parameters. For all of these machines, we find that a series of well-known concrete machine refactorings, plus a technique we call store-allocated continuations, leads to machines that abstract into static analyses simply by bounding their stores. We demonstrate that the technique scales up uniformly to allow static analysis of realistic language features, including tail calls, conditionals, side effects, exceptions, first-class continuations, and even garbage collection.
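A minimal sketch of the pivotal refactoring, store-allocated continuations: the continuation lives in the store and the machine state holds only its address, so allocation becomes the sole source of unboundedness. This is our own illustrative Haskell rendering under simplifying assumptions (untyped call-by-value lambda terms, closures as the only values), not the paper's machines; replacing alloc with a function into a finite address set, and joining rather than overwriting store entries, gives the bounded abstraction the paper derives.

-- Illustrative CESK*-style machine: environments map variables to store
-- addresses, and continuations are themselves store-allocated, so a state
-- is (control, environment, store, continuation address).
import qualified Data.Map as Map

type Var  = String
type Addr = Int

data Expr  = Ref Var | Lam Var Expr | App Expr Expr deriving Show
type Env   = Map.Map Var Addr
data Val   = Clo Var Expr Env deriving Show
data Kont  = Halt | Ar Expr Env Addr | Fn Val Addr deriving Show
data Cell  = V Val | K Kont deriving Show
type Store = Map.Map Addr Cell
data State = State Expr Env Store Addr deriving Show

-- Concrete allocator: always fresh.  The "abstracting" step amounts to
-- replacing this with a function into a finite set (e.g. an expression
-- label) and letting the store map each address to a set of cells.
alloc :: Store -> Addr
alloc = Map.size

inject :: Expr -> State
inject e = State e Map.empty (Map.fromList [(0, K Halt)]) 0

step :: State -> Maybe State
step (State c env sto ka) = case c of
    Ref x   -> do
        a <- Map.lookup x env
        V v <- Map.lookup a sto
        ret v
    Lam x b -> ret (Clo x b env)
    App f e ->
      let ka'  = alloc sto
          sto' = Map.insert ka' (K (Ar e env ka)) sto
      in  Just (State f env sto' ka')
  where
    ret v = do
      K kont <- Map.lookup ka sto
      case kont of
        Halt         -> Nothing          -- final state: the value meets Halt
        Ar e env' k' ->
          let ka'  = alloc sto
              sto' = Map.insert ka' (K (Fn v k')) sto
          in  Just (State e env' sto' ka')
        Fn (Clo x b env') k' ->
          let va   = alloc sto
              sto' = Map.insert va (V v) sto
          in  Just (State b (Map.insert x va env') sto' k')

Iterating step from inject (App (Lam "x" (Ref "x")) (Lam "y" (Ref "y"))) ends when the resulting closure meets the Halt continuation and step returns Nothing.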
Systematic Realisation of Control Flow Analyses for CML
In Proceedings of ICFP'97, 1997
"... We present a methodology for the systematic realisation of control flow analyses and illustrate it for Concurrent ML. We start with an abstract specification of the analysis that is next proved semantically sound with respect to a traditional small-step operational semantics; this result holds for t ..."
Abstract
-
Cited by 22 (10 self)
- Add to MetaCart
(Show Context)
We present a methodology for the systematic realisation of control flow analyses and illustrate it for Concurrent ML. We start with an abstract specification of the analysis that is next proved semantically sound with respect to a traditional small-step operational semantics; this result holds for terminating as well as non-terminating programs. The analysis is defined coinductively and it is shown that all programs have a least analysis result (that is indeed the best one). To realise the analysis we massage the specification in three stages: (i) to explicitly record reachability of subexpressions, (ii) to be defined in a syntax-directed manner, and (iii) to generate a set of constraints that subsequently can be solved by standard techniques. We prove equivalence results between the different versions of the analysis; in particular it follows that the least solution to the constraints generated will also be the least analysis result for the initial specification.
1 Introduction
Many c...
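Stage (iii) typically produces conditional set constraints that a standard worklist or chaotic iteration solves. The sketch below shows the general shape of such a solver, iterating from the empty solution to the least one; the constraint forms are chosen by us for illustration rather than taken from the paper.

-- Illustrative solver for simple conditional set constraints:
--   {t} <= X,    X <= Y,    ({t} <= X) ==> (Y <= Z)
-- Chaotic iteration from the empty solution reaches the least solution.
import qualified Data.Map as Map
import qualified Data.Set as Set

type Tok  = String           -- abstract values (e.g. labels of abstractions)
type FVar = String           -- flow variables

data Constraint
  = Elem Tok FVar            -- {t} <= X
  | Sub  FVar FVar           -- X <= Y
  | Cond Tok FVar FVar FVar  -- {t} <= X  ==>  Y <= Z
  deriving (Eq, Show)

type Sol = Map.Map FVar (Set.Set Tok)

solve :: [Constraint] -> Sol
solve cs = go Map.empty
  where
    go sol = let sol' = foldl apply sol cs
             in if sol' == sol then sol else go sol'
    get v sol    = Map.findWithDefault Set.empty v sol
    put v ts sol = Map.insertWith Set.union v ts sol
    apply sol (Elem t v)  = put v (Set.singleton t) sol
    apply sol (Sub x y)   = put y (get x sol) sol
    apply sol (Cond t x y z)
      | t `Set.member` get x sol = put z (get y sol) sol
      | otherwise                = sol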
Constraint abstractions
In Programs as Data Objects II, 2001
"... Smoke simulation is a key feature of serious gaming applications for fire-fighting professionals. A perfect visual appearance is not of paramount importance, the behavior of the smoke must however closely resemble its natural counterpart for successful adoption of the application. We therefore sugge ..."
Abstract
-
Cited by 18 (2 self)
- Add to MetaCart
Smoke simulation is a key feature of serious gaming applications for fire-fighting professionals. A perfect visual appearance is not of paramount importance; the behavior of the smoke must, however, closely resemble its natural counterpart for the application to be adopted successfully. We therefore suggest a hybrid grid/particle-based architecture for smoke simulation that uses a cheap multi-sampling technique for controlling smoke behavior. This approach is simple enough to be implemented in current-generation game engines, and uses techniques that are very suitable for GPU implementation, thus enabling the use of hardware acceleration for the smoke simulation.
A Usage Analysis With Bounded Usage Polymorphism and Subtyping
In Proceedings of the 12th International Workshop on Implementation of Functional Languages, number AIB-00-7 in Aachener Informatik Berichte, 2000
"... Previously proposed usage analyses have proved not to scale up well for large programs. In this paper we present a powerful and accurate type based analysis designed to scale up for large programs. The key features of the type system are usage subtyping and bounded usage polymorphism. Bounded polymo ..."
Abstract
-
Cited by 14 (3 self)
- Add to MetaCart
(Show Context)
Previously proposed usage analyses have proved not to scale up well for large programs. In this paper we present a powerful and accurate type-based analysis designed to scale up for large programs. The key features of the type system are usage subtyping and bounded usage polymorphism. Bounded polymorphism can lead to huge constraint sets, so to represent constraints compactly we introduce a new, expressive form of constraint in which constraint sets are shared through calls to constraint abstractions.
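The last sentence can be pictured as follows (our schematic rendering; the paper's actual constraint language differs in detail): a constraint abstraction is a named, parameterised constraint set, and a large constraint refers to it by a call rather than inlining a fresh copy at every instantiation.

-- Schematic constraints with constraint abstractions: instead of inlining a
-- (possibly huge) constraint set at every use, a constraint may "call" a
-- named abstraction with argument variables.  Assumptions of this sketch:
-- abstractions are non-recursive and variable names are globally distinct,
-- so the naive expansion below is capture-free.
import qualified Data.Map as Map

type UVar = String                        -- usage variables
type Name = String                        -- names of constraint abstractions

data Atom = Leq UVar UVar deriving Show   -- u <= v

data Constraint
  = CAtom Atom
  | CAnd  [Constraint]
  | CCall Name [UVar]                     -- invoke an abstraction
  deriving Show

data Abstraction = Abstraction [UVar] Constraint   -- formal parameters, body
type Defs = Map.Map Name Abstraction

-- Naive expansion, shown only to explain what a call means; the point of
-- the representation is that a solver never needs to expand it like this.
expand :: Defs -> Constraint -> [Atom]
expand defs c = case c of
  CAtom a      -> [a]
  CAnd cs      -> concatMap (expand defs) cs
  CCall n args -> case Map.lookup n defs of
    Nothing -> []
    Just (Abstraction formals body) ->
      let sub  = Map.fromList (zip formals args)
          ren v = Map.findWithDefault v v sub
          renA (Leq a b) = Leq (ren a) (ren b)
      in  map renA (expand defs body)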
Simple Usage Polymorphism
TIC 2000, 2000
"... We present a novel inference algorithm for a type system featuring subtyping and usage (annotation) polymorphism. This algorithm infers simply-polymorphic types rather than the constrained-polymorphic types usual in such a setting; it achieves this by means of constraint approximation. The algorithm ..."
Abstract
-
Cited by 12 (0 self)
- Add to MetaCart
We present a novel inference algorithm for a type system featuring subtyping and usage (annotation) polymorphism. This algorithm infers simply-polymorphic types rather than the constrained-polymorphic types usual in such a setting; it achieves this by means of constraint approximation. The algorithm is motivated by practical considerations and experience of a previous system, and has been implemented in a production compiler with positive results. We believe the algorithm may well have applications in settings other than usage-type inference.
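One way to picture constraint approximation (ours, schematic; not the paper's algorithm): a constrained usage-polymorphic scheme can be collapsed by choosing a substitution that satisfies its constraints, for instance identifying the two usage variables below, which yields a less general but simply polymorphic type:

\[
\forall u_1\, u_2.\; (u_1 \le u_2) \Rightarrow \tau^{u_1} \to \tau^{u_2}
\quad\rightsquigarrow\quad
\forall u.\; \tau^{u} \to \tau^{u}
\]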
Cheap Eagerness: Speculative Evaluation in a Lazy Functional Language
2000
"... Cheap eagerness is an optimization where cheap and safe expressions are evaluated before it is known that their values are needed. Many compilers for lazy functional languages implement this optimization, but they are limited by a lack of information about the global flow of control and about which ..."
Abstract
-
Cited by 8 (2 self)
- Add to MetaCart
Cheap eagerness is an optimization where cheap and safe expressions are evaluated before it is known that their values are needed. Many compilers for lazy functional languages implement this optimization, but they are limited by a lack of information about the global flow of control and about which variables are already evaluated. Without this information, even a variable reference is a potentially unsafe expression! In this paper we show that significant speedups are achievable by cheap eagerness. Our cheapness analysis uses the results of a program-wide data and control flow analysis to find out which variables may be unevaluated and which variables may be bound to functions which are dangerous to call.
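A toy version of the cheapness judgement (our own illustration, not the paper's analysis): the sets of already-evaluated variables and of known-safe callees stand in for the program-wide flow information the paper computes.

-- Toy "cheap and safe" check: an expression may be evaluated speculatively
-- if it does bounded work and can neither fail nor diverge.  The sets of
-- already-evaluated variables and of known-safe callees are assumptions of
-- this sketch, standing in for the results of a global flow analysis.
import qualified Data.Set as Set

type Var = String

data Expr
  = EVar Var
  | ELit Int
  | EAdd Expr Expr
  | ECall Var [Expr]                -- call to a statically known function
  | ELet Var Expr Expr
  deriving Show

cheap :: Set.Set Var    -- variables known to be evaluated already
      -> Set.Set Var    -- functions known to be cheap and total
      -> Expr -> Bool
cheap evald safeFns e = case e of
  ELit _     -> True
  EVar x     -> x `Set.member` evald      -- forcing an unevaluated variable
                                          -- could trigger arbitrary work
  EAdd a b   -> cheap evald safeFns a && cheap evald safeFns b
  ECall f as -> f `Set.member` safeFns && all (cheap evald safeFns) as
  ELet {}    -> False                     -- keep the sketch conservative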
Control Flow Analysis of UML 2.0 Sequence Diagrams
Proceedings of the European Conference on Model Driven Architecture - Foundations and Applications (ECMDA-FA), 2005
"... This article presents a control flow analysis methodology based on UML 2.0 sequence diagrams (SD). In contrast to the conventional code-based control flow analysis techniques, this technique can be used earlier in software development life cycle, when the UML design model of a system becomes availab ..."
Abstract
-
Cited by 7 (4 self)
- Add to MetaCart
This article presents a control flow analysis methodology based on UML 2.0 sequence diagrams (SD). In contrast to conventional code-based control flow analysis techniques, this technique can be used earlier in the software development life cycle, when the UML design model of a system becomes available. Among many applications, this technique can be used in SD-based test techniques, model comprehension and model execution in the context of MDA. Based on the well-defined UML 2.0 activity diagrams, we propose an extended activity diagram metamodel, referred to as a Concurrent Control Flow Graph (CCFG), to support control flow analysis of UML 2.0 sequence diagrams. Our strategy in this article is to define an OCL-based mapping in a formal and verifiable form as consistency rules between an SD and a CCFG, so as to ensure the completeness of the rules and the CCFG metamodel with respect to our control flow analysis purpose and allow their verification. Completeness here means if the
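As a rough sketch of the target structure only (the paper defines the CCFG as an extension of the UML 2.0 activity diagram metamodel together with OCL consistency rules, none of which is reproduced here), a concurrent control flow graph needs fork and join nodes in addition to ordinary control flow nodes:

-- Schematic shape of a concurrent control flow graph (CCFG): ordinary
-- control flow nodes plus fork/join nodes for the concurrency implied by
-- asynchronous messages.  Only meant to indicate the kind of structure a
-- sequence diagram is mapped onto; the paper's metamodel is richer.
import qualified Data.Map as Map

type NodeId = Int

data Node
  = InitialNode
  | FinalNode
  | CallNode String        -- message/operation name taken from the SD
  | DecisionNode           -- from alt/opt/loop combined fragments
  | ForkNode               -- start of concurrent flows (e.g. an async send)
  | JoinNode               -- synchronisation of concurrent flows
  deriving Show

data CCFG = CCFG
  { nodes :: Map.Map NodeId Node
  , edges :: [(NodeId, NodeId)]    -- directed control flow edges
  } deriving Show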