Results 1 - 10 of 23,623
Random Oracles are Practical: A Paradigm for Designing Efficient Protocols
1995
"... We argue that the random oracle model  where all parties have access to a public random oracle  provides a bridge between cryptographic theory and cryptographic practice. In the paradigm we suggest, a practical protocol P is produced by first devising and proving correct a protocol P R for the ..."
Abstract

Cited by 1643 (75 self)
 Add to MetaCart
We argue that the random oracle model, in which all parties have access to a public random oracle, provides a bridge between cryptographic theory and cryptographic practice. In the paradigm we suggest, a practical protocol P is produced by first devising and proving correct a protocol P^R for the random oracle model, and then replacing oracle accesses by the computation of an "appropriately chosen" function h. This paradigm yields protocols much more efficient than standard ones while retaining many of the advantages of provable security. We illustrate these gains for problems including encryption, signatures, and zero-knowledge proofs.
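The paradigm sketched above can be pictured concretely: a protocol proven secure when parties query a random oracle H is deployed by substituting a fixed hash function for H. Below is a minimal, hedged illustration in Python of that instantiation step, using a toy hash-based commitment with SHA-256 standing in as one candidate for the "appropriately chosen" function h; the scheme and the names H, commit, and verify are inventions of this example, not constructions from the paper.

```python
# Toy illustration of the instantiation step: a scheme analyzed with a random
# oracle H is deployed by substituting a concrete hash function for H.
# Illustrative only; not taken from the paper.
import hashlib
import os

def H(data: bytes) -> bytes:
    """Stand-in for the random oracle: an 'appropriately chosen' hash."""
    return hashlib.sha256(data).digest()

def commit(message: bytes):
    """Commit to a message; returns (commitment, opening randomness)."""
    r = os.urandom(32)                  # fresh randomness hides the message
    return H(r + message), r

def verify(commitment: bytes, message: bytes, r: bytes) -> bool:
    """Re-derive the commitment from the claimed opening."""
    return H(r + message) == commitment

c, r = commit(b"attack at dawn")
assert verify(c, b"attack at dawn", r)
assert not verify(c, b"attack at dusk", r)
```

The security argument is carried out in the model where H behaves as a truly random function; swapping in SHA-256 is exactly the heuristic replacement step the paradigm asks the designer to justify.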
Automatic verification of finite-state concurrent systems using temporal logic specifications
ACM Transactions on Programming Languages and Systems, 1986
"... We give an efficient procedure for verifying that a finitestate concurrent system meets a specification expressed in a (propositional, branchingtime) temporal logic. Our algorithm has complexity linear in both the size of the specification and the size of the global state graph for the concurrent ..."
Abstract

Cited by 1384 (62 self)
 Add to MetaCart
We give an efficient procedure for verifying that a finite-state concurrent system meets a specification expressed in a (propositional, branching-time) temporal logic. Our algorithm has complexity linear in both the size of the specification and the size of the global state graph for the concurrent system. We also show how this approach can be adapted to handle fairness. We argue that our technique can provide a practical alternative to manual proof construction or use of a mechanical theorem prover for verifying many finite-state concurrent systems. Experimental results show that state machines with several hundred states can be checked in a matter of seconds.
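To give a feel for the state-labeling style of model checking described here, the sketch below (a hedged illustration, not the paper's algorithm; the graph encoding and the name check_EF are assumptions of this example) labels the states of an explicit finite transition system that satisfy the CTL formula EF p by a backward fixed-point computation, which runs in time linear in the number of edges.

```python
# Label every state of a finite transition system that satisfies EF p
# ("p is reachable") by iterating a backward fixed point over predecessors.
# Illustrative fragment only.
def check_EF(states, transitions, p_states):
    """states: iterable of states; transitions: dict state -> list of successors;
    p_states: set of states where the atomic proposition p holds.
    Returns the set of states satisfying EF p."""
    labeled = set(p_states)
    # Pre-compute predecessors so each backward step is cheap.
    preds = {s: set() for s in states}
    for s, succs in transitions.items():
        for t in succs:
            preds[t].add(s)
    frontier = list(labeled)
    while frontier:                       # iterate to a fixed point
        t = frontier.pop()
        for s in preds[t]:
            if s not in labeled:          # s has a successor satisfying EF p
                labeled.add(s)
                frontier.append(s)
    return labeled

# Tiny example: 0 -> 1 -> 2, with p holding only in state 2.
print(check_EF({0, 1, 2}, {0: [1], 1: [2], 2: []}, {2}))   # {0, 1, 2}
```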
An Experimental Comparison of Min-Cut/Max-Flow Algorithms for Energy Minimization in Vision
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2001
"... After [10, 15, 12, 2, 4] minimum cut/maximum flow algorithms on graphs emerged as an increasingly useful tool for exact or approximate energy minimization in lowlevel vision. The combinatorial optimization literature provides many mincut/maxflow algorithms with different polynomial time compl ..."
Abstract

Cited by 1311 (54 self)
 Add to MetaCart
After [10, 15, 12, 2, 4], minimum cut/maximum flow algorithms on graphs emerged as an increasingly useful tool for exact or approximate energy minimization in low-level vision. The combinatorial optimization literature provides many min-cut/max-flow algorithms with different polynomial time complexity. Their practical efficiency, however, has to date been studied mainly outside the scope of computer vision. The goal of this paper ...
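As background for the comparison described above, the following sketch shows the min-cut/max-flow primitive itself, using a generic BFS augmenting-path (Edmonds-Karp style) method on a toy graph. This is only an illustration of the primitive being benchmarked, not any of the algorithms evaluated in the paper, and the dict-of-dicts graph encoding is an assumption of the example.

```python
# Generic BFS augmenting-path max-flow on a toy graph; by max-flow/min-cut
# duality the returned value also equals the minimum s-t cut capacity.
from collections import deque

def max_flow(capacity, source, sink):
    """capacity: dict of dicts of residual capacities, modified in place."""
    flow = 0
    while True:
        # Breadth-first search for a shortest augmenting path.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in capacity.get(u, {}).items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow                          # no augmenting path left
        # Trace the path back, find its bottleneck, and push that much flow.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(capacity[u][v] for u, v in path)
        for u, v in path:
            capacity[u][v] -= bottleneck
            capacity.setdefault(v, {}).setdefault(u, 0)
            capacity[v][u] += bottleneck         # residual (reverse) edge
        flow += bottleneck

# Two disjoint s-t paths with capacities 3 and 2, so the max flow is 5.
cap = {"s": {"a": 3, "b": 2}, "a": {"t": 3}, "b": {"t": 2}}
print(max_flow(cap, "s", "t"))                   # 5
```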
Graphical models, exponential families, and variational inference
2008
"... The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building largescale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fiel ..."
Abstract

Cited by 800 (26 self)
 Add to MetaCart
The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fields, including bioinformatics, communication theory, statistical physics, combinatorial optimization, signal and image processing, information retrieval and statistical machine learning. Many problems that arise in specific instances, including the key problems of computing marginals and modes of probability distributions, are best studied in the general setting. Working with exponential family representations, and exploiting the conjugate duality between the cumulant function and the entropy for exponential families, we develop general variational representations of the problems of computing likelihoods, marginal probabilities and most probable configurations. We describe how a wide variety of algorithms, among them sum-product, cluster variational methods, expectation propagation, mean field methods, max-product and linear programming relaxation, as well as conic programming relaxations, can all be understood in terms of exact or approximate forms of these variational representations. The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
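One of the algorithm families the abstract places inside this variational framework is sum-product (belief propagation). The sketch below runs it on a three-variable binary chain and checks the node marginals against brute-force enumeration; the potentials and variable names are invented for illustration and are not taken from the paper.

```python
# Minimal sum-product (belief propagation) on a binary chain x0 - x1 - x2.
# Potentials are invented for illustration only.
import numpy as np

psi = np.array([[1.0, 0.5],          # pairwise potential psi(x_i, x_{i+1}),
                [0.5, 2.0]])         # shared by both edges of the chain
phi = [np.array([1.0, 1.0]),         # unary potentials phi_i(x_i)
       np.array([0.3, 1.0]),
       np.array([1.0, 0.2])]

# Forward and backward messages along the chain.
m_fwd = [np.ones(2), None, None]
m_bwd = [None, None, np.ones(2)]
m_fwd[1] = psi.T @ (phi[0] * m_fwd[0])   # message x0 -> x1
m_fwd[2] = psi.T @ (phi[1] * m_fwd[1])   # message x1 -> x2
m_bwd[1] = psi @ (phi[2] * m_bwd[2])     # message x2 -> x1
m_bwd[0] = psi @ (phi[1] * m_bwd[1])     # message x1 -> x0

# Node marginals are proportional to the unary potential times both messages.
for i in range(3):
    bel = phi[i] * m_fwd[i] * m_bwd[i]
    print(f"p(x{i}) =", bel / bel.sum())

# Brute-force check of one marginal by enumerating all 8 joint configurations.
joint = np.zeros((2, 2, 2))
for x0 in range(2):
    for x1 in range(2):
        for x2 in range(2):
            joint[x0, x1, x2] = (phi[0][x0] * phi[1][x1] * phi[2][x2]
                                 * psi[x0, x1] * psi[x1, x2])
joint /= joint.sum()
print("brute force p(x1) =", joint.sum(axis=(0, 2)))
```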
Kodaira-Spencer theory of gravity and exact results for quantum string amplitudes
Commun. Math. Phys., 1994
"... We develop techniques to compute higher loop string amplitudes for twisted N = 2 theories with ĉ = 3 (i.e. the critical case). An important ingredient is the discovery of an anomaly at every genus in decoupling of BRST trivial states, captured to all orders by a master anomaly equation. In a particu ..."
Abstract

Cited by 545 (60 self)
 Add to MetaCart
We develop techniques to compute higher loop string amplitudes for twisted N = 2 theories with ĉ = 3 (i.e. the critical case). An important ingredient is the discovery of an anomaly at every genus in the decoupling of BRST trivial states, captured to all orders by a master anomaly equation. In a particular realization of the N = 2 theories, the resulting string field theory is equivalent to a topological theory in six dimensions, the Kodaira-Spencer theory, which may be viewed as the closed string analog of Chern-Simons theory. Using the mirror map, this leads to a computation of the 'number' of holomorphic curves of higher genus in Calabi-Yau manifolds. It is shown that topological amplitudes can also be reinterpreted as computing corrections to superpotential terms appearing in the effective 4d theory resulting from compactification of standard 10d superstrings on the corresponding N = 2 theory. Relations with c = 1 strings are also pointed out.
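For orientation, the "master anomaly equation" referred to here is usually quoted in the form below. This is a hedged recollection of the standard presentation, not a quotation from the abstract: F_g denotes the genus-g free energy, D_i the covariant derivative on the moduli space, K the Kähler potential, G the metric on moduli space, and C the Yukawa coupling.

```latex
% Holomorphic anomaly equation for genus g >= 2 (standard form; reproduced
% as a hedged recollection, not verbatim from the paper):
\[
  \bar{\partial}_{\bar{i}} F_g
  = \frac{1}{2}\, \bar{C}_{\bar{i}}^{\,jk}
    \Big( D_j D_k F_{g-1} + \sum_{r=1}^{g-1} D_j F_r \, D_k F_{g-r} \Big),
  \qquad
  \bar{C}_{\bar{i}}^{\,jk} = e^{2K} G^{j\bar{j}} G^{k\bar{k}} \bar{C}_{\bar{i}\bar{j}\bar{k}} .
\]
```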
Transfer of Cognitive Skill
1989
"... A framework for skill acquisition is proposed that includes two major stages in the development of a cognitive skill: a declarative stage in which facts about the skill domain are interpreted and a procedural stage in which the domain knowledge is directly embodied in procedures for performing the s ..."
Abstract

Cited by 869 (21 self)
 Add to MetaCart
A framework for skill acquisition is proposed that includes two major stages in the development of a cognitive skill: a declarative stage in which facts about the skill domain are interpreted and a procedural stage in which the domain knowledge is directly embodied in procedures for performing the skill. This general framework has been instantiated in the ACT system in which facts are encoded in a propositional network and procedures are encoded as productions. Knowledge compilation is the process by which the skill transits from the declarative stage to the procedural stage. It consists of the subprocesses of composition, which collapses sequences of productions into single productions, and proceduralization, which embeds factual knowledge into productions. Once proceduralized, further learning processes operate on the skill to make the productions more selective in their range of applications. These processes include generalization, discrimination, and strengthening of productions. Comparisons are made to similar concepts from past learning theories. How these learning mechanisms apply to produce the power law speedup in processing time with practice is discussed.

It requires at least 100 hours of learning and practice to acquire any significant cognitive skill to a reasonable degree of proficiency. For instance, after 100 hours a student learning to program a computer has achieved only a very modest facility in the skill. Learning one's primary language takes tens of thousands of hours. The psychology of human learning has been very thin in ideas about what happens to skills under the impact of this amount of learning, and for obvious reasons. This article presents a theory about the changes in the nature of a skill over such large time scales and about the basic learning processes that are responsible.
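The "power law speedup" mentioned above is the standard power law of practice; its usual form is stated below as a hedged aside, with constants chosen purely for illustration and not taken from the article.

```latex
% Power law of practice (standard form; the constants below are illustrative):
\[
  T(N) = A + B\,N^{-\alpha},
\]
% where $T(N)$ is the time to perform the skill on the $N$-th practice trial,
% $A$ is the asymptotic floor time, $B$ the learnable component, and
% $\alpha > 0$ the learning rate. For example, with $A = 1$ s, $B = 9$ s,
% $\alpha = 0.5$, performance falls from $10$ s on trial 1 to $1.9$ s on
% trial 100.
```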