Results 11 – 20 of 1,572,319
FlowMap: An Optimal Technology Mapping Algorithm for Delay Optimization in Lookup-Table Based FPGA Designs
 IEEE TRANS. CAD
, 1994
Abstract

Cited by 317 (41 self)
The field programmable gate-array (FPGA) has become an important technology in VLSI ASIC designs. In the past few years, a number of heuristic algorithms have been proposed for technology mapping in lookup-table (LUT) based FPGA designs, but none of them guarantees optimal solutions for general Boolean networks, and little is known about how far their solutions are from the optimal ones. This paper presents a theoretical breakthrough which shows that the LUT-based FPGA technology mapping problem for depth minimization can be solved optimally in polynomial time. A key step in our algorithm is to compute a minimum height K-feasible cut in a network, which is solved optimally in polynomial time based on network flow computation. Our algorithm also effectively minimizes the number of LUTs by maximizing the volume of each cut and by several postprocessing operations. Based on these results, we have implemented an LUT-based FPGA mapping package called FlowMap. We have tested FlowMap on a large set of benchmark examples and compared it with other LUT-based FPGA mapping algorithms for delay optimization, including Chortle-d, MIS-pga-delay, and DAG-Map. FlowMap reduces the LUT network depth by up to 7% and reduces the number of LUTs by up to 50% compared to the three previous methods.
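The key subroutine named in this abstract, computing a minimum cut via network flow, can be illustrated with a generic max-flow/min-cut sketch (Edmonds-Karp in Python). This is not FlowMap's actual minimum-height K-feasible-cut construction, which runs on a transformed network; the graph, node names, and capacities below are purely illustrative assumptions.

```python
from collections import deque

def add_edge(cap, u, v, c):
    """Add a directed edge u -> v with capacity c, plus a zero-capacity
    reverse edge so the residual graph can be traversed."""
    cap.setdefault(u, {})
    cap[u][v] = cap[u].get(v, 0) + c
    cap.setdefault(v, {}).setdefault(u, 0)

def _augmenting_path(cap, flow, s, t):
    """Shortest augmenting path from s to t in the residual graph (BFS)."""
    parent = {s: None}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            path, v = [], t
            while v != s:
                path.append((parent[v], v))
                v = parent[v]
            return path[::-1]
        for v in cap[u]:
            if cap[u][v] - flow.get((u, v), 0) > 0 and v not in parent:
                parent[v] = u
                queue.append(v)
    return None

def max_flow_min_cut(cap, s, t):
    """Return (max flow value, source side of a minimum s-t cut)."""
    flow = {}
    while True:
        path = _augmenting_path(cap, flow, s, t)
        if path is None:
            break
        slack = min(cap[u][v] - flow.get((u, v), 0) for u, v in path)
        for u, v in path:
            flow[(u, v)] = flow.get((u, v), 0) + slack
            flow[(v, u)] = flow.get((v, u), 0) - slack
    # Nodes still reachable from s in the residual graph form the
    # source side of a minimum cut (max-flow min-cut theorem).
    side, queue = {s}, deque([s])
    while queue:
        u = queue.popleft()
        for v in cap[u]:
            if cap[u][v] - flow.get((u, v), 0) > 0 and v not in side:
                side.add(v)
                queue.append(v)
    return sum(flow.get((s, v), 0) for v in cap[s]), side

# Tiny illustrative network: max flow 5, and {s} is the source side
# of a minimum cut because both edges out of s are saturated.
cap = {}
add_edge(cap, 's', 'a', 3)
add_edge(cap, 's', 'b', 2)
add_edge(cap, 'a', 't', 2)
add_edge(cap, 'b', 't', 3)
add_edge(cap, 'a', 'b', 1)
val, side = max_flow_min_cut(cap, 's', 't')
```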
Discovery of Inference Rules for Question Answering
 Natural Language Engineering
, 2001
Abstract

Cited by 307 (7 self)
One of the main challenges in question answering is the potential mismatch between the expressions in questions and the expressions in texts. While humans appear to use inference rules such as “X writes Y” implies “X is the author of Y” in answering questions, such rules are generally unavailable to question-answering systems due to the inherent difficulty in constructing them. In this paper, we present an unsupervised algorithm for discovering inference rules from text. Our algorithm is based on an extended version of Harris' Distributional Hypothesis, which states that words that occurred in the same contexts tend to be similar. Instead of using this hypothesis on words, we apply it to paths in the dependency trees of a parsed corpus. Essentially, if two paths tend to link the same set of words, we hypothesize that their meanings are similar. We use examples to show that our system discovers many inference rules easily missed by humans.
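The extended Distributional Hypothesis in this abstract can be sketched as a toy similarity measure over dependency-path slot fillers. The paper's actual system computes a frequency-weighted similarity over a large parsed corpus; the Jaccard overlap, geometric mean, and tiny hand-picked filler sets below are simplifying assumptions for illustration only.

```python
def jaccard(a, b):
    """Set overlap in [0, 1]."""
    return len(a & b) / len(a | b) if a | b else 0.0

def path_similarity(path1_fillers, path2_fillers):
    """Each argument is a pair (X-slot fillers, Y-slot fillers) observed
    for one dependency path. If two paths tend to link the same words,
    their meanings are hypothesized to be similar."""
    sim_x = jaccard(path1_fillers[0], path2_fillers[0])
    sim_y = jaccard(path1_fillers[1], path2_fillers[1])
    return (sim_x * sim_y) ** 0.5  # geometric mean of the slot overlaps

# "X writes Y" and "X is the author of Y" share most slot fillers,
# so the measure treats the two paths as near-paraphrases.
writes = ({"Shakespeare", "Austen"}, {"Hamlet", "Emma"})
author_of = ({"Shakespeare", "Austen", "Tolstoy"}, {"Hamlet", "Emma"})
```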
On denotational completeness
 Theoretical Computer Science
, 1997
Abstract

Cited by 5 (2 self)
The founding idea of linear logic is the duality between A and A⊥, with values in ⊥. This idea is at work in the original denotational semantics of linear logic, coherent spaces, but also in the phase semantics of linear logic, where the bilinear form which induces the duality is nothing but the ...
DEPICTION AND DENOTATION
Abstract
WHAT IS the difference between describing something and depicting it? The answer lies not merely in the difference between words and pictures, nor even in the difference between how each type of symbol relates to what it symbolizes. As will be argued, a general account of this distinction must be given in terms of what distinguishes the symbol systems in which words and pictures operate. In particular, pictorial systems are marked by a feature that I call continuous correlation. In his book Languages of Art, Nelson Goodman forcefully argues that depiction or representation is not a property of a picture per se but is relative to the system that it is in. Though providing a general theory of symbol systems (from which this paper borrows), Goodman fails to give an adequate account of what marks pictorial systems. After an examination of his account, this paper presents continuous correlation as the distinguishing feature of pictorial systems. A phenomenological correlate of this feature, which I call metaphorical identification, will also be noted.
On Denoting Descriptions
 University of Amsterdam
, 1997
Abstract
In this paper we want to show that the notions of information that have become fashionable in various recently employed systems of dynamic semantics are very useful for characterizing the referential interpretation of definite descriptions and the specific interpretation of indefinite ones. With this more or less trivial observation, we hope to serve a more general goal. In our opinion, current linguistic theorizing lacks a clear and generally accepted picture of the delineation of the semantic and pragmatic components of a comprehensive theory of interpretation. In addition, we think, we nowadays lack a convincing conception of the interaction of these two components of the whole theory, even where there is consensus about their delineation in certain areas of work. With the work reported in this paper we want to bring the issue of the semantics-pragmatics interface back on the agenda. Employing tools from dynamic semantics, viz., its notions of information, we present a separate, but combined, semantic-pragmatic treatment of definite and indefinite noun phrases. Basically, we adopt a Russellian treatment of these descriptions on the semantic side, thereby (i) assuming some form of context-dependent quantification and (ii) adopting a dynamic-style constructive reading of the (implicit) existential quantifiers. A modification of Russell's theory along the first lines is obviously needed in order to get rid of too strong uniqueness requirements in the case of definites (and of too weak readings of indefinites). The modification along the second lines is key to this paper. Precisely the use of dynamic semantic `discourse referents' (or `subjects', as we call them here) provides us with a hook to attach the pragmatics of these noun phra...
We denote
Abstract
Dedicated to Professor Alexandru Lupaş on the occasion of his 65th birthday. In this paper we study some quadrature formulas which are obtained using the connection between monosplines and quadrature formulas. For the remainder term we give some inequalities.
A Risk-Factor Model Foundation for Ratings-Based Bank Capital Rules
 Journal of Financial Intermediation
, 2003
"... When economic capital is calculated using a portfolio model of credit value-at-risk, the marginal capital requirement for an instrument depends, in general, on the properties of the portfolio in which it is held. By contrast, ratings-based capital rules, including both the current Basel Accord and i ..."
Abstract

Cited by 283 (1 self)
... There is no similarly simple way to address violation of the single-factor assumption.
Extensible Denotational Language Specifications
 Symposium on Theoretical Aspects of Computer Software, number 789 in LNCS
, 1994
Abstract

Cited by 38 (6 self)
Traditional denotational semantics assigns radically different meanings to one and the same phrase depending on the rest of the programming language. If the language is purely functional, the denotation of a numeral is a function from environments to integers. But, in a functional language with impe ...
Sense and denotation as algorithm and value
 Lecture Notes in Logic
, 1994
Abstract

Cited by 36 (3 self)
In his classic 1892 paper On sense and denotation [12], Frege first contends that in addition to their denotation (reference, Bedeutung), proper names also have a sense (Sinn) “wherein the mode of presentation [of the denotation] is contained.” Here proper names include common nouns like “the earth ...
Automated Consistency Checking of Requirements Specifications
, 1996
Abstract

Cited by 268 (33 self)
This paper describes a formal analysis technique, called consistency checking, for automatic detection of errors, such as type errors, nondeterminism, missing cases, and circular definitions, in requirements specifications. The technique is designed to analyze requirements specifications expressed in the SCR (Software Cost Reduction) tabular notation. As background, the SCR approach to specifying requirements is reviewed. To provide a formal semantics for the SCR notation and a foundation for consistency checking, a formal requirements model is introduced; the model represents a software system as a finite state automaton, which produces externally visible outputs in response to changes in monitored environmental quantities. Results are presented of two experiments which evaluated the utility and scalability of our technique for consistency checking in a real-world avionics application. The role of consistency checking during the requirements phase of software development is discussed.
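Two of the error classes named in this abstract, missing cases and nondeterminism, can be sketched by exhaustive enumeration over a small condition table. The actual SCR toolset analyzes its tabular notation symbolically rather than by brute force; the predicate encoding and the variable names below are illustrative assumptions only.

```python
from itertools import product

def check_consistency(variables, rows):
    """Check a condition table over boolean variables.
    rows: predicates mapping an assignment dict to True when the row applies.
    Returns (missing, nondeterministic): assignments covered by no row,
    and assignments covered by more than one row."""
    missing, nondeterministic = [], []
    for values in product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        hits = [i for i, row in enumerate(rows) if row(env)]
        if not hits:
            missing.append(env)          # no row applies: missing case
        elif len(hits) > 1:
            nondeterministic.append((env, hits))  # overlapping rows
    return missing, nondeterministic

# A two-row table that forgets the case where both conditions are false.
rows = [
    lambda e: e["high_pressure"],
    lambda e: not e["high_pressure"] and e["valve_open"],
]
missing, nondet = check_consistency(["high_pressure", "valve_open"], rows)
```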