Results 1–10 of 13
MONA: Monadic Second-Order Logic in Practice
 In Tools and Algorithms for the Construction and Analysis of Systems, First International Workshop, TACAS '95, LNCS 1019
, 1995
Abstract

Cited by 149 (20 self)
The purpose of this article is to introduce Monadic Second-order Logic as a practical means of specifying regularity. The logic is a highly succinct alternative to the use of regular expressions. We have built a tool MONA, which acts as a decision procedure and as a translator to finite-state automata. The tool is based on new algorithms for minimizing finite-state automata that use binary decision diagrams (BDDs) to represent transition functions in compressed form. A by-product of this work is a new bottom-up algorithm to reduce BDDs in linear time without hashing. The potential ...
Real-Time Simulation of a Set Machine on a RAM
 In Proceedings of ICCI '89, May 1989; also in Computing and Information
, 1989
ON THE VARIANCE OF THE EXTERNAL PATH LENGTH IN A SYMMETRIC DIGITAL TRIE
, 1989
Abstract

Cited by 19 (5 self)
In this paper we give an exact and asymptotic analysis of the variance of the external path length in a symmetric digital trie. This problem was open up to now. We prove that for the binary symmetric trie the variance is asymptotically equal to 4.35... n + n f(log2 n), where n is the number of stored records and f(x) is a periodic function with a very small amplitude.
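The claimed constant can be checked empirically. The sketch below (my own Monte Carlo, not the paper's analysis; all function names are mine) simulates a symmetric binary trie by splitting n keys with fair coin flips at every level and summing leaf depths:

```python
import random

def external_path_length(n, depth=0):
    """External path length of a random symmetric binary trie on n keys.
    Each key independently branches left/right with probability 1/2;
    a subtrie holding a single key becomes a leaf at that depth."""
    if n == 0:
        return 0
    if n == 1:
        return depth
    left = sum(1 for _ in range(n) if random.random() < 0.5)
    return (external_path_length(left, depth + 1)
            + external_path_length(n - left, depth + 1))

def estimate(n, trials=1500, seed=3):
    """Sample mean and variance of the external path length."""
    random.seed(seed)
    xs = [external_path_length(n) for _ in range(trials)]
    mean = sum(xs) / trials
    var = sum((x - mean) ** 2 for x in xs) / (trials - 1)
    return mean, var
```

For moderate n (say n = 128) the sampled variance divided by n should land near 4.35, up to the small periodic term and sampling noise.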
Efficient translation of external input in a dynamically typed language. IFIP '94 proceedings
, 1994
Abstract

Cited by 17 (4 self)
New algorithms are given to compile external data in string form into data structures for high-level datatypes. Let I be a language of external constants formed from atomic constants and from set, multiset, and tuple constructors. We show how to read an input string C, decide whether it belongs to I, convert it to internal form, and build initial data structures storing the internal value of C in linear worst-case time with respect to the number of symbols in C. The algorithm does not require hashing or address arithmetic, but relies only on list processing. A principal subproblem is to detect and remove duplicate elements from set-valued input. To solve this subproblem we extend the technique of multiset discrimination [2, 5] to detect all duplicate elements of a multiset, where these elements may themselves be tuples, multisets, or sets with arbitrary degree of nesting. To handle the case where the elements are multisets, we introduce a new technique called weak sorting, which sorts all of these multisets uniformly according to an arbitrary total order computed by the algorithm. The cost of computing this total order and of sorting all of the multisets is linear in the sum of the number of elements in each of the multisets. Our algorithms are based on a sequential pointer RAM model of computation [4, 7], which accesses and stores data using pointers but disallows address arithmetic (which precludes direct access into arrays). This improves on previous algorithms used to solve the related reading problem in SETL [3, 6]. Those algorithms used hashing even for deeply nested data to detect duplicate values. If we assume that hashing unit-space data takes unit expected time and linear worst-case time, then for arbitrary data those algorithms would require linear expected time and quadratic worst-case time in the number of symbols in C.
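The key idea behind weak sorting can be illustrated in a few lines. The sketch below is my own simplification, not the paper's pointer-machine algorithm: a Python dict and comparison sort stand in for the pointer-based bucketing that makes the original linear-time. The essential point survives: every multiset is rewritten in the *same* arbitrary total order (here, order of first appearance), so equal multisets become identical sequences and duplicates can be found by direct comparison.

```python
def weak_sort(multisets):
    """Rewrite each multiset in one arbitrary-but-uniform total order,
    so that equal multisets become identical sequences.
    Illustrative sketch only: the cited paper achieves this on a
    pointer RAM without hashing or comparison sorting."""
    code = {}                                # atom -> discovery index
    for ms in multisets:
        for a in ms:
            code.setdefault(a, len(code))    # assign codes in first-seen order
    # Sort every multiset by the computed codes (the arbitrary total order).
    return [sorted(ms, key=code.__getitem__) for ms in multisets]
```

After weak sorting, two multisets are equal exactly when their rewritten sequences are equal, which reduces duplicate detection on multisets to duplicate detection on sequences.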
An n log n Algorithm for Online BDD Refinement
 Computer Aided Verification, CAV '97, volume 1254 of LNCS
, 1995
Abstract

Cited by 13 (3 self)
Binary Decision Diagrams are in widespread use in verification systems for the canonical representation of Boolean functions. A BDD representing a function φ : B → N can easily be reduced to its canonical form in linear time.
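The canonical form relies on two local rules: a node whose branches coincide is redundant, and structurally identical nodes are shared. A minimal sketch (my own, using the textbook unique-table construction; the dict here is a stand-in, since the papers in this listing achieve the same reduction without hashing):

```python
class BDD:
    """Minimal reduced-BDD node store. Node indices 0 and 1 are the
    terminals; internal nodes are (var, low, high) triples."""

    def __init__(self):
        self.nodes = [None, None]        # slots 0 and 1 reserved for terminals
        self.unique = {}                 # (var, low, high) -> node index

    def mk(self, var, low, high):
        if low == high:                  # redundant test: both branches agree
            return low
        key = (var, low, high)
        if key not in self.unique:       # share isomorphic subgraphs
            self.nodes.append(key)
            self.unique[key] = len(self.nodes) - 1
        return self.unique[key]
```

Building every node through `mk` keeps the diagram reduced at all times, so no separate reduction pass is needed.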
High-level reading and data structure compilation
 In Proc. 24th ACM SIGPLAN-SIGACT Symp. on Principles of Prog. Lang.
, 1997
Multiset Discrimination: A Method for Implementing Programming Language Systems Without Hashing
Abstract

Cited by 1 (0 self)
It is generally assumed that hashing is essential to many algorithms related to efficient compilation; e.g., symbol table formation and maintenance, grammar manipulation, basic block optimization, and global optimization. This paper questions this assumption, and initiates development of an efficient alternative compiler methodology without hashing or sorting. Underlying this methodology are several generic algorithmic tools, among which special importance is given to Multiset Discrimination, which partitions a multiset into blocks of duplicate elements. We show how multiset discrimination, together with other tools, can be tailored to rid compilation of hashing without loss in asymptotic performance. Because of the simplicity of these tools, our results may be of practical as well as theoretical interest. The various applications presented culminate with a new algorithm to solve iterated strength reduction folded with useless code elimination that runs in worst case asympto...
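The core operation, partitioning a multiset into blocks of duplicates without hashing, can be sketched for strings by bucketing position by position (the classic use case being symbol-table construction). This is my own illustrative version: per-call bucket allocation and `ord` indexing assume a byte-sized alphabet, and the paper's linear-time variant reuses buckets rather than reallocating them.

```python
def discriminate(strings):
    """Partition the index set of `strings` into blocks of identical
    strings by bucketing on one character position at a time.
    No whole-string hashing or comparison sorting is used."""
    def split(indices, pos):
        # All strings in `indices` agree on their first `pos` characters.
        buckets = [[] for _ in range(257)]     # 256 byte values + "ended"
        for i in indices:
            s = strings[i]
            if len(s) == pos:
                buckets[256].append(i)         # string exhausted here
            else:
                buckets[ord(s[pos])].append(i)
        blocks = []
        if buckets[256]:
            blocks.append(buckets[256])        # equal prefix and equal length
        for b in buckets[:256]:
            if len(b) == 1:
                blocks.append(b)               # unique string: singleton block
            elif b:
                blocks.extend(split(b, pos + 1))
        return blocks
    return split(list(range(len(strings))), 0)
```

Each block holds the positions of one distinct string, so assigning every block a serial number yields symbol-table indices with no hash function in sight.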
PATRICIA TRIES AGAIN REVISITED
, 1986
Abstract
This paper studies the average complexity of Patricia tries from the successful and unsuccessful search point of view. It is assumed that the Patricia trie is built over a V-element alphabet, and keys are strings of elements from the alphabet. The occurrence of the i-th element from the alphabet in a key is given by a probability p_i, i = 1, 2, ..., V. We also assume that n keys are stored in the Patricia trie. These assumptions determine the so-called Bernoulli model. Let S_n and U_n denote the successful search and unsuccessful search in the Patricia, respectively. We prove that the m-th moment of the successful search, E(S_n)^m, satisfies lim_{n→∞} E(S_n)^m / ln^m n = 1/h_1^m, where h_1 = -Σ_{i=1}^{V} p_i ln p_i. In particular, we show that the variance of S_n is var S_n = c ln n + O(1) (c is a constant dependent on p_i, i = 1, 2, ..., V) for an asymmetric Patricia, and var S_n = O(1) for a symmetric Patricia (e.g., if V = 2, var S_n = 1.00). The unsuccessful search U_n is studied only for binary symmetric Patricia tries. We prove that lim_{n→∞} E(U_n)^m / lg^m n = 1. In particular, the variance of U_n is given by var U_n = 0.87.
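The striking O(1)-variance claim for the symmetric case is easy to probe numerically. The sketch below (my own simulation, not the paper's derivation; function names are mine) models a binary symmetric Patricia trie by resplitting until both subtries are nonempty, which mimics path compression: depth grows only at genuine branching points.

```python
import random

def key_depths(n, depth=0, out=None):
    """Depths of the n keys in a random binary symmetric (V = 2)
    Patricia trie: unary nodes are collapsed by resplitting."""
    if out is None:
        out = []
    if n == 1:
        out.append(depth)
        return out
    while True:   # skip splits where all keys agree (compressed edge)
        left = sum(1 for _ in range(n) if random.random() < 0.5)
        if 0 < left < n:
            break
    key_depths(left, depth + 1, out)
    key_depths(n - left, depth + 1, out)
    return out

def successful_search_stats(n, trials=1200, seed=7):
    """Sample S_n: the depth of a uniformly random key in a random trie."""
    random.seed(seed)
    xs = [random.choice(key_depths(n)) for _ in range(trials)]
    mean = sum(xs) / trials
    var = sum((x - mean) ** 2 for x in xs) / (trials - 1)
    return mean, var
```

For n = 128 the sampled mean should sit near log2 n and the variance near 1, matching the abstract's var S_n = 1.00 for V = 2.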