Results 1–10 of 15
"Probabilistic relational verification for cryptographic implementations," Unpublished manuscript, 2013
"... Relational program logics have been used for mechanizing formal proofs of various cryptographic constructions. With an eye towards scaling these successes towards endtoend security proofs for implementations of distributed systems, we present RF⋆, a relational extension of F⋆, a generalpurpose ..."
Abstract

Cited by 8 (1 self)
Relational program logics have been used for mechanizing formal proofs of various cryptographic constructions. With an eye towards scaling these successes towards end-to-end security proofs for implementations of distributed systems, we present RF⋆, a relational extension of F⋆, a general-purpose higher-order stateful programming language with a verification system based on refinement types. The distinguishing feature of RF⋆ is a relational Hoare logic for a higher-order, stateful, probabilistic language. Through careful language design, we adapt the F⋆ typechecker to generate both classic and relational verification conditions, and to automatically discharge their proofs using an SMT solver. Thus, we are able to benefit from the existing features of F⋆, including its abstraction facilities for modular reasoning about program fragments. We evaluate RF⋆ experimentally by programming a series of cryptographic constructions and protocols, and by verifying their security properties, ranging from information flow to unlinkability, integrity, and privacy. Moreover, we validate the design of RF⋆ by formalizing in Coq a core probabilistic λ-calculus and a relational refinement type system, and by proving the soundness of the latter against a denotational semantics of the probabilistic λ-calculus.
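As a concrete flavor of the relational, probabilistic properties involved (a brute-force sketch in Python, not RF⋆'s type-based verification, with hypothetical function names): probabilistic noninterference says that runs on any two secrets yield identical output distributions, as for a one-bit one-time pad.

```python
def otp_encrypt(secret_bit, key_bit):
    """One-time pad over a single bit: ciphertext = secret XOR key."""
    return secret_bit ^ key_bit

def output_distribution(secret_bit):
    """Ciphertext distribution when the key is uniform over {0, 1}."""
    counts = {0: 0, 1: 0}
    for key_bit in (0, 1):
        counts[otp_encrypt(secret_bit, key_bit)] += 1
    return {c: n / 2 for c, n in counts.items()}

# Relational, probabilistic property: runs on the two possible secrets
# produce identical output distributions, so the ciphertext leaks nothing.
assert output_distribution(0) == output_distribution(1)
```

A relational verifier establishes this equality of distributions statically, by typing, rather than by enumeration as here.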
Stochastic Optimization of Floating-Point Programs with Tunable Precision
"... The aggressive optimization of floatingpoint computations is an important problem in highperformance computing. Unfortunately, floatingpoint instruction sets have complicated semantics that often force compilers to preserve programs as written. We present a method that treats floatingpoint opti ..."
Abstract

Cited by 5 (1 self)
The aggressive optimization of floating-point computations is an important problem in high-performance computing. Unfortunately, floating-point instruction sets have complicated semantics that often force compilers to preserve programs as written. We present a method that treats floating-point optimization as a stochastic search problem. We demonstrate the ability to generate reduced-precision implementations of Intel's handwritten C numeric library which are up to 6 times faster than the original code, and achieve end-to-end speedups of over 30% on a direct numeric simulation and a ray tracer by optimizing kernels that can tolerate a loss of precision while still remaining correct. Because these optimizations are mostly not amenable to formal verification using the current state of the art, we present a stochastic search technique for characterizing maximum error. The technique comes with an asymptotic guarantee and provides strong evidence of correctness.
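The error-characterization idea can be sketched in Python (all function names here are hypothetical, and this plain Monte Carlo sampler stands in for the paper's more sophisticated stochastic search): sample inputs at random and track the worst observed deviation between a double-precision reference kernel and a reduced-precision variant.

```python
import random
import struct

def to_float32(x):
    """Round a Python float (binary64) to binary32 precision."""
    return struct.unpack("f", struct.pack("f", x))[0]

def kernel_ref(x):
    """Reference: a short Taylor series for exp(x), in double precision."""
    return 1.0 + x + x * x / 2.0 + x * x * x / 6.0

def kernel_lowprec(x):
    """Reduced-precision variant: every intermediate rounded to float32."""
    x = to_float32(x)
    x2 = to_float32(x * x)
    return to_float32(to_float32(1.0 + x) + to_float32(x2 / 2.0)
                      + to_float32(to_float32(x2 * x) / 6.0))

def estimate_max_error(samples=100_000, lo=-1.0, hi=1.0, seed=0):
    """Monte Carlo search for the worst observed error on [lo, hi]."""
    rng = random.Random(seed)
    worst = 0.0
    for _ in range(samples):
        x = rng.uniform(lo, hi)
        worst = max(worst, abs(kernel_ref(x) - kernel_lowprec(x)))
    return worst
```

Such sampling yields evidence of an error bound rather than a proof, which is exactly the asymptotic-guarantee trade-off the abstract describes.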
Robustness Analysis of Finite Precision Implementations
"... Abstract. A desirable property of control systems is robustness to inputs, when small perturbations of the inputs of a system will cause only small perturbations on outputs. This property should be maintained at the implementation level, where close inputs can lead to different execution paths. The ..."
Abstract

Cited by 3 (0 self)
Abstract. A desirable property of control systems is robustness to inputs: small perturbations of a system's inputs should cause only small perturbations of its outputs. This property should be maintained at the implementation level, where close inputs can lead to different execution paths. The problem becomes crucial for finite-precision implementations, where every elementary computation is affected by an error. In this context, almost every test is potentially unstable, that is, for a given input, the finite-precision and real-number paths may differ. Still, state-of-the-art error analyses rely on the stable-test hypothesis, yielding unsound error bounds when the conditional block is not robust to uncertainties. We propose a new abstract-interpretation-based error analysis of finite-precision implementations, which is sound in the presence of unstable tests, by bounding the discontinuity error for path divergences. This gives a tractable analysis implemented in the FLUCTUAT analyzer.
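An unstable test can be illustrated in a few lines of Python (a hypothetical example, not drawn from the paper): exact real arithmetic, modeled here with rationals, takes one branch, while rounded floating-point arithmetic takes the other, so the two semantics diverge by the discontinuity between the branches.

```python
from fractions import Fraction

def path_float(x):
    """Finite-precision semantics: 0.1 + 0.2 rounds to
    0.30000000000000004 > 0.3, so for x = 0.3 the else-branch runs."""
    t = 0.1 + 0.2
    if t <= x:
        return 2.0 * x           # then-branch
    return 2.0 * x + 1.0         # else-branch: jumps by 1 at the boundary

def path_real(x):
    """Idealized real semantics, modeled with exact rationals:
    1/10 + 2/10 = 3/10 <= 3/10, so for x = 3/10 the then-branch runs."""
    t = Fraction(1, 10) + Fraction(2, 10)
    if t <= x:
        return 2 * x
    return 2 * x + 1
```

For the boundary input, the two paths differ by the full inter-branch discontinuity of 1, which is precisely the error a stable-test analysis would miss.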
S.: Robustness Analysis of Networked Systems. In: Verification, Model Checking, and Abstract Interpretation (VMCAI), 2013
"... Abstract. Many software systems are naturally modeled as networks of interacting elements such as computing nodes, input devices, and output devices. In this paper, we present a notion of robustness for a networked system when the underlying network is prone to errors. We model such a system N as a ..."
Abstract

Cited by 2 (2 self)
Abstract. Many software systems are naturally modeled as networks of interacting elements such as computing nodes, input devices, and output devices. In this paper, we present a notion of robustness for a networked system when the underlying network is prone to errors. We model such a system N as a set of processes that communicate with each other over a set of internal channels, and interact with the outside world through a fixed set of input and output channels. We focus on network errors that arise from channel perturbations, and assume that we are given a worst-case bound δ on the number of errors that can occur in the internal channels of N. We say that the system N is (δ, ε)-robust if the deviation of the output of the perturbed system from the output of the unperturbed system is bounded by ε. We study a specific instance of this problem when each process is a Mealy machine, and the distance metric used to quantify the deviation from the desired output is either the L1-norm or the Levenshtein distance (also known as the edit distance). We present efficient decision procedures for (δ, ε)-robustness for both distance metrics. Our solution draws upon techniques from automata theory, essentially reducing the problem of checking (δ, ε)-robustness to the problem of checking emptiness for a certain class of reversal-bounded counter automata.
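A minimal sketch in Python (a hypothetical machine; the paper's decision procedures use counter automata rather than brute force): encode a Mealy machine as a transition table and check (δ, ε)-robustness under the L1-norm exhaustively over bounded-length inputs.

```python
from itertools import product

# A two-state Mealy machine over binary alphabets, given as
# transitions[(state, input_symbol)] = (next_state, output_symbol).
# Hypothetical example: it outputs the XOR of consecutive input bits.
transitions = {
    ("q0", 0): ("q0", 0),
    ("q0", 1): ("q1", 1),
    ("q1", 0): ("q0", 1),
    ("q1", 1): ("q1", 0),
}

def run(word, start="q0"):
    """Run the machine on a sequence of bits and collect its outputs."""
    state, out = start, []
    for sym in word:
        state, o = transitions[(state, sym)]
        out.append(o)
    return out

def l1(u, v):
    """L1-norm distance between equal-length output words."""
    return sum(abs(a - b) for a, b in zip(u, v))

def check_robust(n, delta, eps):
    """Brute-force (delta, eps)-robustness over all inputs of length n:
    every perturbation within Hamming distance delta must change the
    output by at most eps in the L1-norm."""
    for w in product([0, 1], repeat=n):
        for w2 in product([0, 1], repeat=n):
            hamming = sum(a != b for a, b in zip(w, w2))
            if hamming <= delta and l1(run(w), run(w2)) > eps:
                return False
    return True
```

For this machine, one flipped input bit flips at most two output bits, so it is (1, 2)-robust but not (1, 1)-robust; the enumeration is exponential in n, which is why the paper reduces the question to emptiness of reversal-bounded counter automata instead.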
Regular Real Analysis
"... Abstract—We initiate the study of regular real analysis, or the analysis of real functions that can be encoded by automata on infinite words. It is known that ωautomata can be used to represent relations between real vectors, reals being represented in exact precision as infinite streams. The regul ..."
Abstract

Cited by 1 (0 self)
Abstract—We initiate the study of regular real analysis, or the analysis of real functions that can be encoded by automata on infinite words. It is known that ω-automata can be used to represent relations between real vectors, reals being represented in exact precision as infinite streams. The regular functions studied here constitute the functional subset of such relations. We show that some classic questions in function analysis can become elegantly computable in the context of regular real analysis. Specifically, we present an automata-theoretic technique for reasoning about limit behaviors of regular functions, and obtain, using this method, a decision procedure to verify the continuity of a regular function. Several other decision procedures for regular functions—for finding roots, fixpoints, minima, etc.—are also presented. At the same time, we show that the class of regular functions is quite rich, and includes functions that are highly challenging to encode using traditional symbolic notation.
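The stream view of reals can be sketched in Python (a simplified illustration, not the paper's ω-automata machinery): a real in [0, 1) is a stream of binary-fraction digits, and a function such as f(x) = x/2 becomes a transducer on such streams.

```python
def binary_stream(x, bits):
    """Yield the first `bits` binary-fraction digits of x in [0, 1)."""
    for _ in range(bits):
        x *= 2
        d = int(x)
        yield d
        x -= d

def halve(digits):
    """Stream transducer for f(x) = x/2: emit a leading 0, then copy
    the input stream unchanged (a one-state machine with output delay)."""
    yield 0
    yield from digits

def value(digits):
    """Real value of a finite binary-fraction prefix."""
    return sum(d / 2 ** (i + 1) for i, d in enumerate(digits))
```

Because the transducer works digit by digit, it operates on exact infinite representations; questions like continuity then become questions about the automaton's limit behavior.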
Robustness Analysis of String Transducers
"... Abstract. Many important functions over strings can be represented as finitestate string transducers. In this paper, we present an automatatheoretic technique for algorithmically verifying that such a function is robust to uncertainty. A function encoded as a transducer is defined to be robust if ..."
Abstract

Cited by 1 (1 self)
Abstract. Many important functions over strings can be represented as finite-state string transducers. In this paper, we present an automata-theoretic technique for algorithmically verifying that such a function is robust to uncertainty. A function encoded as a transducer is defined to be robust if for each small (i.e., bounded) change to any input string, the change in the transducer's output is proportional to the change in the input. Changes to input and output strings are quantified using weighted generalizations of the Levenshtein and Manhattan distances over strings. Our main technical contribution is a set of decision procedures based on reducing the problem of robustness verification of a transducer to the problem of checking the emptiness of a reversal-bounded counter machine. The decision procedures under the generalized Manhattan and Levenshtein distance metrics are in PSPACE and EXPSPACE, respectively. For transducers that are Mealy machines, the decision procedures under these metrics are in NLOGSPACE and PSPACE, respectively.
Quantitative timed simulation functions and refinement metrics for real-time systems. In HSCC, ACM, 2013
"... ar ..."
(Show Context)
Consistency Analysis of Decision-Making Programs
"... Applications in many areas of computing make discrete decisions under uncertainty, for reasons such as limited numerical precision in calculations and errors in sensorderived inputs. As a result, individual decisions made by such programs may be nondeterministic, and lead to contradictory decision ..."
Abstract

Cited by 1 (1 self)
Applications in many areas of computing make discrete decisions under uncertainty, for reasons such as limited numerical precision in calculations and errors in sensor-derived inputs. As a result, individual decisions made by such programs may be nondeterministic, and lead to contradictory decisions at different points of an execution. This means that an otherwise correct program may execute along paths that it would not follow under its ideal semantics, violating essential program invariants on the way. A program is said to be consistent if it does not suffer from this problem despite uncertainty in decisions. In this paper, we present a sound, automatic program analysis for verifying that a program is consistent in this sense. Our analysis proves that each decision made along a program execution is consistent with the decisions made earlier in the execution. The proof
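The consistency notion can be illustrated with a small Python sketch (a hypothetical runtime check with an assumed error bound, whereas the paper presents a static analysis): a decision whose margin falls below the error bound may flip between runs, and decisions along one run must not contradict each other.

```python
ERR = 0.05  # assumed worst-case bound on sensor/rounding error (hypothetical)

def decide(reading, threshold):
    """A discrete decision under uncertainty: True/False when the margin
    exceeds the error bound, None when the decision could flip."""
    if abs(reading - threshold) <= ERR:
        return None
    return reading > threshold

def consistent(readings, threshold):
    """A run is consistent if every decision is determinate and agrees
    with the first decision made along the execution."""
    decisions = [decide(r, threshold) for r in readings]
    return None not in decisions and all(d == decisions[0] for d in decisions)
```

The static analysis in the paper establishes this property for all executions at once, rather than monitoring a single run.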
A Taste of Sound Reasoning in Faust
"... We address the question of what software verification can do for the audio community by showcasing some preliminary design ideas and tools for a new framework dedicated to the formal reasoning about Faust programs. We use as a foundation one of the strongest current proof assistants, namely Coq com ..."
Abstract
We address the question of what software verification can do for the audio community by showcasing some preliminary design ideas and tools for a new framework dedicated to formal reasoning about Faust programs. We use as a foundation one of the strongest current proof assistants, namely Coq combined with SSReflect. We illustrate the practical impact of our approach via a use case, namely the proof that the implementation of a simple low-pass filter written in the Faust audio programming language indeed meets one of its specification properties. The paper thus serves three purposes: (1) to provide a gentle introduction to the use of formal tools to the audio community, (2) to put forward programming and formal reasoning paradigms we think are well suited to the audio domain, and (3) to illustrate this approach on a simple yet practical audio signal processing example, a low-pass filter.
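The kind of specification property at stake can be sketched in Python rather than Faust/Coq (a hypothetical two-point moving-average filter, not the paper's example): a low-pass filter should pass a constant (DC) signal at full amplitude while cancelling the highest representable (Nyquist-rate) signal.

```python
def lowpass(xs):
    """Two-point moving average y[n] = (x[n] + x[n-1]) / 2, with x[-1] = 0."""
    prev, out = 0.0, []
    for x in xs:
        out.append((x + prev) / 2.0)
        prev = x
    return out

def peak(ys):
    """Peak absolute amplitude of a signal."""
    return max(abs(y) for y in ys)

# Specification-style test signals: a DC signal and an alternating
# Nyquist-rate signal, which this filter cancels after one transient sample.
dc = [1.0] * 16
nyquist = [(-1.0) ** n for n in range(16)]
```

A proof assistant like Coq establishes such gain properties for all inputs symbolically, where this sketch only checks two concrete signals.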
Introduction to Automatic Software Repair, 2015
"... This document presents an introduction to automatic software repair. It has been first ..."
Abstract
This document presents an introduction to automatic software repair. It has been first