Results 1–10 of 25,839
The rate-distortion function for source coding with side information at the decoder
 IEEE Trans. Inform. Theory
, 1976
"... Abstract: Let {(X_k, Y_k)}_{k=1}^∞ be a sequence of independent drawings of a pair of dependent random variables X, Y. Let us say that X takes values in the finite set 𝒳. It is desired to encode the sequence {X_k} in blocks of length n into a binary stream of rate R, which can in turn be decoded as a seque ..."
Cited by 1060 (1 self)
the infimum is with respect to all auxiliary random variables Z (which take values in a finite set 𝒵) that satisfy: i) Y, Z conditionally independent given X; ii) there exists a function f: 𝒴 × 𝒵 → 𝒳̂ such that E[D(X, f(Y, Z))] ≤ d. Let R_{X|Y}(d) be the rate-distortion function which results when the encoder
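The quantity this snippet is building up to is the Wyner-Ziv side-information rate-distortion function; a standard statement of the result (not text from this entry) is

```latex
R^{*}_{X|Y}(d) \;=\; \inf_{Z}\,\bigl[\, I(X;Z) - I(Y;Z) \,\bigr],
```

with the infimum taken over exactly the auxiliary variables Z satisfying conditions i) and ii) above.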
Term Rewriting Systems
, 1992
"... Term Rewriting Systems play an important role in various areas, such as abstract data type specifications, implementations of functional programming languages and automated deduction. In this chapter we introduce several of the basic concepts and facts for TRSs. Specifically, we discuss Abstra ..."
Cited by 610 (18 self)
A Syntactic Approach to Type Soundness
 Information and Computation
, 1992
"... We present a new approach to proving type soundness for Hindley/Milner-style polymorphic type systems. The keys to our approach are (1) an adaptation of subject reduction theorems from combinatory logic to programming languages, and (2) the use of rewriting techniques for the specification of the la ..."
Cited by 629 (22 self)
Entropy-Based Algorithms for Best Basis Selection
 IEEE Transactions on Information Theory
, 1992
"... pretations (position, frequency, and scale), and we have experimented with feature-extraction methods that use best-basis compression for front-end complexity reduction. The method relies heavily on the remarkable orthogonality properties of the new libraries. It is obviously a nonlinear transformat ..."
Cited by 675 (20 self)
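The "entropy-based" cost behind this entry's best-basis search is additive over disjoint coefficient blocks, which is what lets a parent node in a wavelet-packet tree be compared directly with its children. A minimal sketch of that comparison, assuming the wavelet-packet coefficients are already given (the transform itself is omitted):

```python
import numpy as np

def shannon_cost(x):
    # Additive entropy cost -sum_i x_i^2 * log(x_i^2), with 0*log(0) := 0.
    # Because it is additive over disjoint blocks, a parent node's cost is
    # directly comparable with the summed cost of its children.
    e = np.square(np.asarray(x, dtype=float))
    e = e[e > 0]
    return float(-(e * np.log(e)).sum())

def choose(parent, children):
    # Best-basis step: keep the parent representation unless splitting
    # into children strictly lowers the total cost.
    child_cost = sum(shannon_cost(c) for c in children)
    parent_cost = shannon_cost(parent)
    if child_cost < parent_cost:
        return "children", child_cost
    return "parent", parent_cost
```

Concentrated coefficient vectors score lower than spread-out ones of equal energy, which is why the search favors sparse representations.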
Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images.
 IEEE Trans. Pattern Anal. Mach. Intell.
, 1984
"... Abstract: We make an analogy between images and statistical mechanics systems. Pixel gray levels and the presence and orientation of edges are viewed as states of atoms or molecules in a lattice-like physical system. The assignment of an energy function in the physical system determines its Gibbs di ..."
Cited by 5126 (1 self)
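The stochastic relaxation this entry describes is single-site Gibbs sampling over a lattice energy. A minimal sketch with an Ising-style smoothness prior plus a data-attachment term, an illustrative stand-in for the paper's general Gibbs energies (the ±1 states, `beta`, and `lam` are assumptions of this sketch):

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sweep(x, y, beta=1.0, lam=2.0):
    """One stochastic-relaxation sweep over a +/-1 image x given data y.

    Energy sketch: E(x) = -beta * sum over neighbor pairs of x_i * x_j
                          - lam  * sum over sites of x_i * y_i.
    Each pixel is resampled from its exact conditional distribution."""
    h, w = x.shape
    for i in range(h):
        for j in range(w):
            nb = 0.0  # sum of 4-neighborhood values
            if i > 0:
                nb += x[i - 1, j]
            if i < h - 1:
                nb += x[i + 1, j]
            if j > 0:
                nb += x[i, j - 1]
            if j < w - 1:
                nb += x[i, j + 1]
            # P(x_ij = +1 | neighbors, y) from the local energy difference
            d = 2.0 * (beta * nb + lam * y[i, j])
            p_plus = 1.0 / (1.0 + np.exp(-d))
            x[i, j] = 1.0 if rng.random() < p_plus else -1.0
    return x
```

Repeated sweeps (with a temperature schedule, omitted here) drive the image toward low-energy, i.e. smooth and data-consistent, configurations.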
Toward an instance theory of automatization
 Psychological Review
, 1988
"... This article presents a theory in which automatization is construed as the acquisition of a domain-specific knowledge base, formed of separate representations, instances, of each exposure to the task. Processing is considered automatic if it relies on retrieval of stored instances, which will occur ..."
Cited by 647 (38 self)
up and predicts a power-function reduction in the standard deviation that is constrained to have the same exponent as the power function for the speed-up. The theory accounts for qualitative properties as well, explaining how some may disappear and others appear with practice. More generally, it provides
The irreducibility of the space of curves of given genus
 Publ. Math. IHES
, 1969
"... Fix an algebraically closed field k. Let M_g be the moduli space of curves of genus g over k. The main result of this note is that M_g is irreducible for every k. Of course, whether or not M_g is irreducible depends only on the characteristic of k. When the characteristic is 0, we can assume that k ~ ..."
Cited by 506 (2 self)
strengthened his method so that it applies in all characteristics (SGA 7, 1968). Mumford has also given a proof using theta functions in char. ≠ 2. The result is this: Stable Reduction Theorem. Let R be a discrete valuation ring with quotient field K. Let A be an abelian variety over K. Then there exists a
Aging: a theory based on free radical and radiation chemistry
 J Gerontol
, 1956
"... The phenomenon of growth, decline and death (aging) has been the source of considerable speculation (1, 8, 10). This cycle seems to be a more or less direct function of the metabolic rate and this in turn depends on the species (animal or plant) on which are superimposed the factors of heredity and ..."
Cited by 637 (2 self)
The processingspeed theory of adult age differences in cognition
 Psychological Review
, 1996
"... A theory is proposed to account for some of the age-related differences reported in measures of Type A or fluid cognition. The central hypothesis in the theory is that increased age in adulthood is associated with a decrease in the speed with which many processing operations can be executed and that ..."
Cited by 416 (2 self)
and that this reduction in speed leads to impairments in cognitive functioning because of what are termed the limited time mechanism and the simultaneity mechanism. That is, cognitive performance is degraded when processing is slow because relevant operations cannot be successfully executed (limited time) and because
MetaCost: A General Method for Making Classifiers Cost-Sensitive
 In Proceedings of the Fifth International Conference on Knowledge Discovery and Data Mining
, 1999
"... Research in machine learning, statistics and related fields has produced a wide variety of algorithms for classification. However, most of these algorithms assume that all errors have the same cost, which is seldom the case in KDD problems. Individually making each classification learner cost-sensi ..."
Cited by 415 (4 self)
functioning or change to it. Unlike stratification, MetaCost is applicable to any number of classes and to arbitrary cost matrices. Empirical trials on a large suite of benchmark databases show that MetaCost almost always produces large cost reductions compared to the cost-blind classifier used (C4.5RULES
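MetaCost's core step relabels each training example with the class that minimizes expected cost under estimated class probabilities, then retrains the base learner on the relabeled data. A minimal sketch of the relabeling rule only; the bagging wrapper that produces the probability estimates and the retraining step are omitted:

```python
import numpy as np

def metacost_relabel(proba, cost):
    """Relabel training examples with MetaCost's Bayes-optimal rule:
    pick the class i minimizing the expected cost sum_j P(j|x) * C(i, j).

    proba: (n_samples, n_classes) class-probability estimates; MetaCost
           obtains these from a bagged ensemble of the base learner.
    cost:  (n_classes, n_classes) matrix, cost[i, j] = cost of predicting
           class i when the true class is j."""
    expected = proba @ cost.T  # expected[n, i] = sum_j P(j|x_n) * cost[i, j]
    return expected.argmin(axis=1)
```

With an asymmetric cost matrix this can flip an example's label even when its nominal class has the higher probability, which is exactly how the relabeled training set makes a cost-blind learner behave cost-sensitively.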