Results 1–10 of 650,005
A Framework for Defining Logics
 JOURNAL OF THE ASSOCIATION FOR COMPUTING MACHINERY
, 1993
"... The Edinburgh Logical Framework (LF) provides a means to define (or present) logics. It is based on a general treatment of syntax, rules, and proofs by means of a typed calculus with dependent types. Syntax is treated in a style similar to, but more general than, MartinLof's system of ariti ..."
Abstract

Cited by 807 (45 self)
 Add to MetaCart
The Edinburgh Logical Framework (LF) provides a means to define (or present) logics. It is based on a general treatment of syntax, rules, and proofs by means of a typed calculus with dependent types. Syntax is treated in a style similar to, but more general than, MartinLof's system
A Uniform Framework for Collection Types
"... We present a new algebra for complex database objects based on monoids and monoid homomorphisms. The object types supported in our algebra can be any nested composition of collection types, which include lists, sets, multisets, vectors, and matrices. We define a new calculus equivalent to this algeb ..."
Abstract
 Add to MetaCart
We present a new algebra for complex database objects based on monoids and monoid homomorphisms. The object types supported in our algebra can be any nested composition of collection types, which include lists, sets, multisets, vectors, and matrices. We define a new calculus equivalent to this algebra, called monoid comprehensions, that captures operations involving diverse collection types in declarative form. We present a normalization algorithm that reduces any expression in our algebra to a canonical form which, when evaluated, generates very few intermediate data structures. This algorithm generalizes some wellknown algebraic optimization techniques and heuristics used in relational query optimization. In addition, we demonstrate the modeling power of this language by capturing physical storage structures and algorithms, such as merge join, hash join, and partitioned hash join. Finally, we extend this algebra by incorporating objectoriented features, such as object identity, an...
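The monoid-homomorphism idea in this abstract can be sketched in a few lines. The names `hom`, `zero`, and `plus` below are illustrative only, not the paper's notation; the point is that one fold, parameterized by a monoid, expresses queries over different collection types:

```python
# A monoid is a pair (zero, plus); a homomorphism maps f over a
# collection and combines the results with plus, starting from zero.

def hom(zero, plus, f, xs):
    """Monoid homomorphism: fold f-mapped elements of xs into the monoid."""
    acc = zero
    for x in xs:
        acc = plus(acc, f(x))
    return acc

items = [("a", 3), ("b", 0), ("c", 5)]

# sum{ qty | (name, qty) <- items }  -- the (0, +) monoid
total = hom(0, lambda a, b: a + b, lambda it: it[1], items)

# { name | (name, qty) <- items, qty > 0 }  -- the set-union monoid
names = hom(set(), lambda a, b: a | b,
            lambda it: {it[0]} if it[1] > 0 else set(), items)

print(total)   # 8
print(names)   # {'a', 'c'}
```

The same `hom` skeleton covers lists, multisets, and aggregates by swapping in a different `(zero, plus)` pair, which is what makes a uniform normalization algorithm over collection types possible.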
Valgrind: A framework for heavyweight dynamic binary instrumentation
 In Proceedings of the 2007 Programming Language Design and Implementation Conference
, 2007
"... Dynamic binary instrumentation (DBI) frameworks make it easy to build dynamic binary analysis (DBA) tools such as checkers and profilers. Much of the focus on DBI frameworks has been on performance; little attention has been paid to their capabilities. As a result, we believe the potential of DBI ha ..."
Abstract

Cited by 545 (5 self)
 Add to MetaCart
Dynamic binary instrumentation (DBI) frameworks make it easy to build dynamic binary analysis (DBA) tools such as checkers and profilers. Much of the focus on DBI frameworks has been on performance; little attention has been paid to their capabilities. As a result, we believe the potential of DBI
Ptolemy: A Framework for Simulating and Prototyping Heterogeneous Systems
, 1992
"... Ptolemy is an environment for simulation and prototyping of heterogeneous systems. It uses modern objectoriented software technology (C++) to model each subsystem in a natural and efficient manner, and to integrate these subsystems into a whole. Ptolemy encompasses practically all aspects of design ..."
Abstract

Cited by 569 (90 self)
 Add to MetaCart
Ptolemy is an environment for simulation and prototyping of heterogeneous systems. It uses modern objectoriented software technology (C++) to model each subsystem in a natural and efficient manner, and to integrate these subsystems into a whole. Ptolemy encompasses practically all aspects of designing signal processing and communications systems, ranging from algorithms and communication strategies, simulation, hardware and software design, parallel computing, and generating realtime prototypes. To accommodate this breadth, Ptolemy must support a plethora of widelydiffering design styles. The core of Ptolemy is a set of objectoriented class definitions that makes few assumptions about the system to be modeled; rather, standard interfaces are provided for generic objects and more specialized, applicationspecific objects are derived from these. A basic abstraction in Ptolemy is the Domain, which realizes a computational model appropriate for a particular type of subsystem. Current e...
LLVM: A compilation framework for lifelong program analysis & transformation
, 2004
"... ... a compiler framework designed to support transparent, lifelong program analysis and transformation for arbitrary programs, by providing highlevel information to compiler transformations at compiletime, linktime, runtime, and in idle time between runs. LLVM defines a common, lowlevel code re ..."
Abstract

Cited by 818 (19 self)
 Add to MetaCart
the exception handling features of highlevel languages (and setjmp/longjmp in C) uniformly and efficiently. The LLVM compiler framework and code representation together provide a combination of key capabilities that are important for practical, lifelong analysis and transformation of programs. To our knowledge
A Practical Bayesian Framework for Backprop Networks
 Neural Computation
, 1991
"... A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks. The framework makes possible: (1) objective comparisons between solutions using alternative network architectures ..."
Abstract

Cited by 496 (20 self)
 Add to MetaCart
A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks. The framework makes possible: (1) objective comparisons between solutions using alternative network architectures
A Uniform Framework for Ordered Restriction Map Problems
 York University
, 1997
"... Optical Mapping is an emerging technology for constructing ordered restriction maps of DNA molecules [1, 2, 3]. The underlying computational problems for this technology have been studied and several cost functions have been proposed in recent literature. Most of these propose combinatorial model ..."
Abstract

Cited by 4 (3 self)
 Add to MetaCart
models; one of them also presents a probabilistic approach. However, it is not a priori clear as to how these cost functions relate to one another and to the underlying problem. We present a uniform framework for the restriction map problems where each of these various models is a specific instance
A Framework for Uplink Power Control in Cellular Radio Systems
 IEEE Journal on Selected Areas in Communications
, 1996
"... In cellular wireless communication systems, transmitted power is regulated to provide each user an acceptable connection by limiting the interference caused by other users. Several models have been considered including: (1) fixed base station assignment where the assignment of users to base stations ..."
Abstract

Cited by 636 (18 self)
 Add to MetaCart
In cellular wireless communication systems, transmitted power is regulated to provide each user an acceptable connection by limiting the interference caused by other users. Several models have been considered including: (1) fixed base station assignment where the assignment of users to base stations is fixed, (2) minimum power assignment where a user is iteratively assigned to the base station at which its signal to interference ratio is highest, and (3) diversity reception, where a user's signal is combined from several or perhaps all base stations. For the above models, the uplink power control problem can be reduced to finding a vector p of users' transmitter powers satisfying p I(p) where the jth constraint p j I j (p) describes the interference that user j must overcome to achieve an acceptable connection. This work unifies results found for these systems by identifying common properties of the interference constraints. It is also shown that systems in which transmitter powers ...
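The reduction to p ≥ I(p) can be illustrated with the standard fixed-point iteration p ← I(p) for the fixed-assignment model; all numbers below (link gains G[j][i], SIR targets, noise power) are invented for the sketch and are not from the paper:

```python
# Sketch: iterate p <- I(p) from p = 0 for a 3-user uplink with fixed
# base station assignment. The gains, targets, and noise are made up.

G = [[1.0, 0.1, 0.2],   # G[j][i]: gain from user i's transmitter
     [0.2, 1.0, 0.1],   # to the base station serving user j
     [0.1, 0.3, 1.0]]
gamma = [0.5, 0.5, 0.5]  # required signal-to-interference ratio per user
noise = 0.1
n = 3

def I(p):
    """I_j(p): least power user j needs so its SIR reaches gamma_j."""
    out = []
    for j in range(n):
        interference = sum(G[j][i] * p[i] for i in range(n) if i != j) + noise
        out.append(gamma[j] * interference / G[j][j])
    return out

# Starting from zero, the iterates increase monotonically toward the
# minimal fixed point p* = I(p*), which satisfies every p_j >= I_j(p).
p = [0.0] * n
for _ in range(200):
    p = I(p)

print(all(p[j] >= I(p)[j] - 1e-9 for j in range(n)))  # True
```

The "common properties" the abstract alludes to are exactly what makes this loop behave: when I is monotone and scalable (a standard interference function), the iteration converges to the minimal feasible power vector whenever any feasible vector exists.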
A uniform framework for predicate abstraction approximation
 In Workshop on Software Verification and Validation (SVV 2006), 2006
"... Abstract. Abstraction refinement is a powerful technique that enables the verification of real systems. An initial coarse abstraction is provided and iteratively refined until either the property is proved to be true or false. Computing a precise abstraction is usually very expensive. Thus, many tec ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
based approximation. When such approximations are employed, adding new predicates is no more sufficient to rule out all spurious counterexamples. Standard model checkers add new contraints to the transition relations in order to refine the approximation. We propose a uniform framework for describing most of known
A Uniform Framework for Deductive Database Derivation Strategies
, 2004
"... A uniform framework is presented to describe the most typical strategies that are used to compute answers to Deductive Databases. The framework is based on the definition of a general Least Fixpoint operator that operates on meta rules. Each set of meta rules represents a different strategy, and thi ..."
Abstract
 Add to MetaCart
A uniform framework is presented to describe the most typical strategies that are used to compute answers to Deductive Databases. The framework is based on the definition of a general Least Fixpoint operator that operates on meta rules. Each set of meta rules represents a different strategy
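The least-fixpoint computation this abstract builds on can be sketched with the textbook naive bottom-up evaluation of a transitive-closure program; the edge/path rules below are a standard example, not the paper's meta-rule formulation:

```python
# Naive least-fixpoint evaluation of a Datalog-style program:
#   path(X, Y) :- edge(X, Y).
#   path(X, Z) :- path(X, Y), edge(Y, Z).
# Apply the immediate-consequence operator until nothing changes.

edges = {("a", "b"), ("b", "c"), ("c", "d")}

def consequences(path):
    """One application of both rules to the current set of path facts."""
    new = set(edges)  # first rule: every edge is a path
    new |= {(x, z)    # second rule: extend a path by one edge
            for (x, y) in path
            for (y2, z) in edges if y == y2}
    return new

# Iterate from the empty set; the loop stops at the least fixpoint.
path = set()
while True:
    nxt = consequences(path)
    if nxt == path:
        break
    path = nxt

print(("a", "d") in path)  # True: a -> b -> c -> d
```

A strategy in the paper's sense (e.g. semi-naive or magic-sets evaluation) would change how `consequences` is driven, not the fixpoint being computed, which is what a single parameterized Least Fixpoint operator can capture.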