Results 1–10 of 11
libalf: The Automata Learning Framework
In CAV, LNCS 6174, 2010
Cited by 11 (5 self)
Abstract. This paper presents libalf, a comprehensive, open-source library for learning formal languages. libalf covers various well-known learning techniques for finite automata (e.g. Angluin’s L*, Biermann, RPNI, etc.) as well as novel learning algorithms (such as for NFA and visibly one-counter automata). libalf is flexible and allows easily interchanging learning algorithms and combining domain-specific features in a plug-and-play fashion. Its modular design and C++ implementation make it a suitable platform for adding and engineering further learning algorithms for new target models (e.g., Büchi automata).
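As a concrete illustration of the observation-table style of learning that libalf implements (Angluin's L* family), here is a minimal sketch. The target language (strings over {a, b} with an even number of a's), the oracle, and all names are our own illustration, not libalf's API; the sketch also skips L*'s consistency check and equivalence queries, which a full implementation needs.

```python
ALPHABET = ["a", "b"]

def member(w):
    # Membership oracle for the illustrative target language:
    # strings with an even number of a's.
    return w.count("a") % 2 == 0

def row(prefix, suffixes):
    # One observation-table row: oracle answers for prefix + each suffix.
    return tuple(member(prefix + e) for e in suffixes)

def learn():
    S = [""]   # access prefixes (table rows)
    E = [""]   # distinguishing suffixes (table columns)
    while True:
        rows = {s: row(s, E) for s in S}
        # Closedness: every one-letter extension of a prefix in S must
        # share its row with some prefix already in S.
        unclosed = next((s + a for s in S for a in ALPHABET
                         if row(s + a, E) not in rows.values()), None)
        if unclosed is not None:
            S.append(unclosed)      # grow the table and re-check
            continue
        # Closed: read off a DFA whose states are the distinct rows.
        start = rows[""]
        accept = {r for r in rows.values() if r[E.index("")]}
        delta = {(rows[s], a): row(s + a, E) for s in S for a in ALPHABET}
        return start, accept, delta

def run(dfa, w):
    start, accept, delta = dfa
    q = start
    for a in w:
        q = delta[(q, a)]
    return q in accept
```

With a genuine system under learning, `member` would issue actual queries to the black box, and an equivalence oracle (or conformance testing) would validate each hypothesis.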
Automated Learning of Probabilistic Assumptions for Compositional Reasoning
Cited by 7 (3 self)
Abstract. Probabilistic verification techniques have been applied to the formal modelling and analysis of a wide range of systems, from communication protocols such as Bluetooth, to nanoscale computing devices, to biological cellular processes. In order to tackle the inherent challenge of scalability, compositional approaches to verification are sorely needed. An example is assume-guarantee reasoning, where each component of a system is analysed independently, using assumptions about the other components that it interacts with. We discuss recent developments in the area of automated compositional verification techniques for probabilistic systems. In particular, we describe techniques to automatically generate probabilistic assumptions that can be used as the basis for compositional reasoning. We do so using algorithmic learning techniques, which have already proved successful for the generation of assumptions for compositional verification of non-probabilistic systems. We also present recent improvements and extensions to this work and survey some of the promising potential directions for further research in this area.
Learning Nondeterministic Mealy Machines
Cited by 1 (1 self)
In applications where abstract models of reactive systems are to be inferred, one important challenge is that the behavior of such systems can be inherently nondeterministic. To cope with this challenge, we developed an algorithm to infer nondeterministic computation models in the form of Mealy machines. We introduce our approach and provide extensive experimental results to assess its potential in the identification of black-box reactive systems. The experiments involve both artificially generated abstract Mealy machines and the identification of a TFTP server model starting from a publicly available implementation.
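To make the object being learned concrete: a nondeterministic Mealy machine maps a (state, input) pair to a set of possible (output, successor) pairs, so a single input word can yield several output sequences. The toy machine and names below are our own illustration, not the paper's algorithm.

```python
# (state, input) -> set of (output, next_state); absence of a key means
# the input is not enabled in that state.
TRANS = {
    ("q0", "req"): {("ack", "q1"), ("nack", "q0")},  # may accept or refuse
    ("q1", "req"): {("busy", "q1")},
    ("q1", "done"): {("ok", "q0")},
}

def output_traces(trans, state, inputs):
    """All output sequences the machine can produce on the input word."""
    configs = {(state, ())}          # reachable (state, outputs-so-far) pairs
    for i in inputs:
        nxt = set()
        for q, outs in configs:
            for o, q2 in trans.get((q, i), ()):
                nxt.add((q2, outs + (o,)))
        configs = nxt                # runs with no enabled transition die
    return {outs for _, outs in configs}
```

An inference algorithm for such machines must observe the same input word several times to sample the set of possible outputs, which is the key difference from the deterministic Mealy setting.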
Languages, Theory
Program Synthesis, which is the task of discovering programs that realize user intent, can be useful in several scenarios: enabling people with no programming background to develop utility programs, helping regular programmers automatically discover tricky/mundane details, program understanding, discovery of new algorithms, and even teaching. This paper describes three key dimensions in program synthesis: expression of user intent, the space of programs over which to search, and the search technique. These concepts are illustrated by brief descriptions of various program synthesis projects that target synthesis of a wide variety of programs, such as standard undergraduate textbook algorithms (e.g., sorting, dynamic programming), program inverses (e.g., decoders, deserializers), bit-vector manipulation routines, deobfuscated programs, graph algorithms, text-manipulating routines, mutual exclusion algorithms, etc. Categories and Subject Descriptors D.1.2 [Programming Techniques]:
Stories About Calculations: Remembering Peter Landin
Abstract This article recalls Peter Landin as a PhD supervisor and introduces his final lecture notes. Peter Landin was my PhD supervisor. I registered in 1988 for a part-time PhD while working for Marconi and met with Peter on Friday afternoons about once every couple of months from 88 to 93, and then sporadically until my graduation in 96. I had a rather idealized notion of Academia, and Peter did not disappoint in terms of a dazzling performance at the chalkface, together with a clear and personal approach to the subject and its history (including gossipy asides). We would meet in Peter’s room after lunch at Queen Mary College, University of London, which in those days was designated (by Peter) smoke-free, in the sense that the smokes were ‘free’; we discussed at length a subject of Peter’s choosing and would typically repair to the bar, which amazingly was at the end of the Computer Science corridor, where Peter would continue the technical discussions. When I took up my first academic post at the University of Bradford in 94, I was dismayed to discover that not all departments have bars at the end of the corridor (and not all departments have people like Peter). Peter had a consuming interest in Programming Languages and was both active and productive during the years I knew him. His approach is best described by John Reynolds: ‘Peter Landin remarked long ago that the goal of his research was to tell beautiful stories about computation’ Reynolds (1999). Furthermore, Peter aimed for precision, to be contrasted with formality, in storytelling. In research and pedagogy Peter was interested in ways of communication and representation, informed by the essence of a computational issue; to articulate precisely was the key. Peter is perhaps best known for discovering a connection and inventing a machine. The former, a correspondence between λ-calculi and programming languages, has had far-reaching consequences; the latter less so, possibly due to misunderstanding of the intended use of the SECD machine and due to the emergence of a more ‘abstract’ approach in terms of semantic models based on relations.
Team MExICo Modelling and Exploitation of Interaction and Concurrency
"... Activity report 2009. Table of contents ..."
On the use of nondeterministic automata for Presburger Arithmetic
Abstract. A well-known decision procedure for Presburger arithmetic uses deterministic finite-state automata. While the complexity of the decision procedure for Presburger arithmetic based on quantifier elimination is known (roughly, there is a double-exponential nondeterministic time lower bound and a triple-exponential deterministic time upper bound), the exact complexity of the automata-based procedure was unknown. We show in this paper that it is triple-exponential as well by analysing the structure of the nondeterministic automata obtained during the construction. Furthermore, we analyse the sizes of deterministic and nondeterministic automata built for several subclasses of Presburger arithmetic, such as disjunctions and conjunctions of atomic formulas. To retain a canonical representation, which is one of the strengths of the use of automata, we use residual finite-state automata, a subclass of nondeterministic automata.
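The automata-based procedure the abstract refers to encodes numbers as bit strings and solutions of a formula as tuples of bit strings read in parallel. As a small, self-contained illustration (our own example, not the paper's construction), the following sketch runs the classical least-significant-bit-first automaton for the atomic formula x + y = z, whose states are just the carry of schoolbook addition:

```python
def bits_lsb(n, width):
    # Binary encoding of n, least significant bit first, padded to `width`.
    return [(n >> i) & 1 for i in range(width)]

def accepts(x, y, z):
    """Run the carry automaton for x + y = z on the bit-triple word."""
    width = max(x.bit_length(), y.bit_length(), z.bit_length()) + 1
    carry = 0                      # the automaton's state
    for xb, yb, zb in zip(bits_lsb(x, width),
                          bits_lsb(y, width),
                          bits_lsb(z, width)):
        s = xb + yb + carry
        if s % 2 != zb:            # no transition on this letter: reject
            return False
        carry = s // 2
    return carry == 0              # accepting state: no carry left over
```

The full decision procedure builds such automata for atomic formulas and combines them with product, projection, and complementation for the Boolean connectives and quantifiers; determinizing after projection is where the blow-up analysed in the paper occurs.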
Second assessor, 2014
Automata learning is increasingly being applied to ease the testing and comparison of complex systems. We formally reconstruct an efficient algorithm for the inference of Mealy machines, prove its correctness, and show that equivalence queries are not required for non-minimal hypotheses. In fact, we are able to evade those by applying a minor optimization to the algorithm. As a corollary, standard conformance testing methods can be used directly for equivalence queries. Model checking plays a central role in the verification of the correctness of software. The main condition required to apply this technique is the availability of a model of the system. In the form of an automaton, this model can be intersected with other automata describing illegal behavior in
Learning Residual Finite-State Automata Using Observation Tables
We define a two-step learner for RFSAs based on an observation table, using an algorithm for minimal DFAs to build a table for the reversal of the language in question and showing that we can derive the minimal RFSA from it after some simple modifications. We compare the algorithm to two other table-based ones, of which one (by Bollig et al. [8]) infers an RFSA directly, while the other is a second two-step learner proposed by the author. We focus on the criterion of query complexity.
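The first step above, building a minimal-DFA table for the reversal of the language, is close in spirit to a classical fact due to Brzozowski: reversing a DFA whose states are all reachable and determinizing the result yields the minimal DFA of the reversed language. A sketch, on a toy DFA of our own (strings over {a, b} ending in a), not the paper's learner:

```python
# DFA for L = strings ending in "a": state 0 rejects, state 1 accepts.
DELTA = {(0, "a"): 1, (0, "b"): 0, (1, "a"): 1, (1, "b"): 0}
START, ACCEPT = 0, {1}
ALPHABET = ["a", "b"]

def determinize_reversal(delta, start, accept, alphabet):
    # Reverse every transition, swap start and accepting states, then run
    # the accessible subset construction (Brzozowski's first half).
    rev = {}
    for (q, a), q2 in delta.items():
        rev.setdefault((q2, a), set()).add(q)
    init = frozenset(accept)
    states, worklist, dtrans = {init}, [init], {}
    while worklist:
        S = worklist.pop()
        for a in alphabet:
            T = frozenset(q for s in S for q in rev.get((s, a), ()))
            dtrans[(S, a)] = T
            if T not in states:
                states.add(T)
                worklist.append(T)
    final = {S for S in states if start in S}
    return init, final, dtrans

def run(dfa, w):
    q, final, dtrans = dfa
    for a in w:
        q = dtrans[(q, a)]
    return q in final

# Accepts reverse(L) = strings starting with "a", with minimal state count.
REV_DFA = determinize_reversal(DELTA, START, ACCEPT, ALPHABET)
```

The learner in the paper works from queries rather than from a given DFA, but the underlying duality between the reversed language's minimal DFA and the residuals of the original language is the same.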