Results 1 - 10 of 2,901,306
MODEL REFERENCE CONTROL IN INVENTORY AND SUPPLY CHAIN MANAGEMENT: The implementation of a more suitable cost function
Cited by 3 (0 self)
Abstract: A method of model reference control is investigated in this study in order to present a more suitable method of controlling an inventory or a supply chain. The difficulty of determining the cost of a change made to the control in supply-chain-related systems is studied, and a solution ...
The rate-distortion function for source coding with side information at the decoder
IEEE Trans. Inform. Theory, 1976
Cited by 1048 (1 self)
Abstract: Let {(X_k, Y_k)}, k = 1, 2, ..., be a sequence of independent drawings of a pair of dependent random variables X, Y. Let us say that X takes values in a finite set. It is desired to encode the sequence {X_k} in blocks of length n into a binary stream of rate R, which can in turn be decoded as a sequence ... The paper studies the quantity R*(d), defined as the infimum of rates R such that (with ε > 0 arbitrarily small and with suitably large n) communication is possible in the above setting at an average distortion level (as defined above) not exceeding d + ε. The main result is that R*(d) = inf [I(X; Z) - I(Y; Z)], where ...
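The mutual-information terms in this excerpt are garbled by extraction; read against the standard statement of the Wyner-Ziv result, the quoted main result is:

```latex
R^*(d) = \inf \left[ I(X;Z) - I(Y;Z) \right],
```

where, in the usual formulation, the infimum is over auxiliary random variables $Z$ such that $Z \leftrightarrow X \leftrightarrow Y$ forms a Markov chain and some decoding function of $(Y, Z)$ achieves average distortion at most $d$.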
Learning to rank using gradient descent
In ICML, 2005
Cited by 513 (17 self)
Abstract: We investigate using gradient descent methods for learning ranking functions; we propose a simple probabilistic cost function, and we introduce RankNet, an implementation of these ideas using a neural network to model the underlying ranking function. We present test results on toy data and on data f...
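The probabilistic cost this excerpt refers to is a pairwise cross-entropy over a logistic function of score differences. A minimal sketch of that cost follows; the function name is illustrative, and a plain score pair stands in for the neural-network scorer:

```python
import numpy as np

def pairwise_cost(s_i, s_j, p_target):
    """RankNet-style cost for one document pair.

    s_i, s_j : model scores for documents i and j.
    p_target : target probability that i should rank above j
               (1.0, 0.5, or 0.0 in the usual formulation).

    The modeled probability is the logistic function of the score
    difference o = s_i - s_j; the cost is the cross-entropy of the
    target against it, which simplifies to -p_target*o + log(1 + e^o).
    """
    o = s_i - s_j
    # log(1 + exp(o)) computed stably for large |o|
    return -p_target * o + np.logaddexp(0.0, o)

# When the model and the target agree, the cost is small;
# when they disagree, it grows roughly linearly in the gap.
print(pairwise_cost(2.0, 0.0, 1.0))  # ~0.127
print(pairwise_cost(0.0, 2.0, 1.0))  # ~2.127
```

Because the cost is smooth in the scores, its gradient can be backpropagated through whatever model produces them, which is what makes gradient-descent training of a ranking network possible.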
Information Theory and Statistics
, 1968
Cited by 1764 (2 self)
Abstract: Entropy and relative entropy are proposed as features extracted from symbol sequences. Firstly, a proper Iterated Function System is driven by the sequence, producing a fractal-like representation (CSR) with a low computational cost. Then, two entropic measures are applied to the CSR histogram of th...
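The two entropic measures named in the excerpt are standard quantities computable from any histogram. A minimal sketch under that reading (the CSR construction itself is not reproduced here):

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a histogram p, normalized internally."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    nz = p > 0                      # 0 * log 0 is taken as 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.

    Assumes q > 0 wherever p > 0; otherwise the divergence
    is infinite.
    """
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

# A uniform 4-bin histogram has 2 bits of entropy and zero
# divergence from itself.
print(entropy([1, 1, 1, 1]))                         # 2.0
print(relative_entropy([1, 1, 1, 1], [1, 1, 1, 1]))  # 0.0
```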
Bundle Adjustment - A Modern Synthesis
VISION ALGORITHMS: THEORY AND PRACTICE, LNCS, 2000
Cited by 556 (12 self)
Abstract: This paper is a survey of the theory and methods of photogrammetric bundle adjustment, aimed at potential implementors in the computer vision community. Bundle adjustment is the problem of refining a visual reconstruction to produce jointly optimal structure and viewing parameter estimates. Topics covered include: the choice of cost function and robustness; numerical optimization including sparse Newton methods, linearly convergent approximations, updating and recursive methods; gauge (datum) invariance; and quality control. The theory is developed for general robust cost functions rather than ...
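One standard robust cost of the kind this excerpt refers to is the Huber cost, shown here as a generic illustration rather than as the paper's specific recommendation. It behaves quadratically for small residuals and linearly for large ones, limiting the influence of outlier measurements:

```python
import numpy as np

def huber(r, delta=1.0):
    """Huber robust cost of residual(s) r with threshold delta.

    Quadratic (0.5 * r^2) for |r| <= delta, linear beyond it,
    with the two pieces matching in value and slope at |r| = delta.
    """
    a = np.abs(np.asarray(r, dtype=float))
    quadratic = 0.5 * a**2
    linear = delta * (a - 0.5 * delta)
    return np.where(a <= delta, quadratic, linear)

# Small residuals are penalized like least squares; a gross
# outlier contributes far less than its squared error would.
print(huber(0.5))       # 0.125
print(huber(3.0, 1.0))  # 2.5 (vs 4.5 for squared error)
```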
End-To-End Arguments In System Design
, 1984
Cited by 1026 (9 self)
Abstract: This paper presents a design principle that helps guide placement of functions among the modules of a distributed computer system. The principle, called the end-to-end argument, suggests that functions placed at low levels of a system may be redundant or of little value when compared with the cost o...
Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm
Machine Learning, 1988
Cited by 766 (5 self)
Keywords: learning Boolean functions, linear-threshold algorithms
Abstract: Valiant (1984) and others have studied the problem of learning various classes of Boolean functions from examples. Here we discuss incremental learning of these functions. We consider a setting in which the learner responds to each example according to a current hypothesis. Then the learner updates the hypothesis, if necessary, based on the correct classification of the example. One natural measure of the quality of learning in this setting is the number of mistakes the learner makes. For suitable classes of functions, learning ...
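The linear-threshold algorithm this entry refers to is commonly known as Winnow. A minimal sketch of one standard variant (multiplicative promotion/demotion of active attributes on mistakes, threshold equal to the attribute count) follows; the target concept and parameter choices are illustrative:

```python
import numpy as np

def winnow_fit(X, y, alpha=2.0, epochs=10):
    """Winnow2-style mistake-driven learner for 0/1 attributes.

    X : (m, n) array of 0/1 attributes; y : 0/1 labels.
    Predict 1 when w . x >= n. On a mistake, multiply the weights
    of the active attributes by alpha (false negative) or by
    1/alpha (false positive); weights start at 1.
    Returns the final weights and the total mistake count.
    """
    m, n = X.shape
    w = np.ones(n)
    mistakes = 0
    for _ in range(epochs):
        for x, label in zip(X, y):
            pred = int(w @ x >= n)
            if pred != label:
                mistakes += 1
                factor = alpha if label == 1 else 1.0 / alpha
                w[x == 1] *= factor
    return w, mistakes

# Target concept: x0 OR x2, buried among 18 irrelevant attributes.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 20))
y = (X[:, 0] | X[:, 2]).astype(int)
w, mistakes = winnow_fit(X, y)
preds = (X @ w >= 20).astype(int)
print((preds == y).mean(), mistakes)
```

The multiplicative updates are what give the mistake bound its logarithmic dependence on the number of irrelevant attributes, the property highlighted in the title.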
A Theory of Diagnosis from First Principles
ARTIFICIAL INTELLIGENCE, 1987
Cited by 1107 (5 self)
Abstract: Suppose one is given a description of a system, together with an observation of the system's behaviour which conflicts with the way the system is meant to behave. The diagnostic problem is to determine those components of the system which, when assumed to be functioning abnormally, will explain ...
Entropy-Based Algorithms For Best Basis Selection
IEEE Transactions on Information Theory, 1992
Cited by 670 (20 self)
Abstract: ... interpretations (position, frequency, and scale), and we have experimented with feature-extraction methods that use best-basis compression for front-end complexity reduction. The method relies heavily on the remarkable orthogonality properties of the new libraries. It is obviously a nonlinear transformation ... we can use information cost functionals defined for signals with normalized energy, since all expansions in a given library will conserve energy. Since two expansions will have the same energy globally, it is not necessary to normalize expansions to compare their costs. This feature greatly enlarges ...
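A typical information cost functional of the kind this excerpt describes is the Shannon-entropy cost of the normalized coefficient energies; because orthonormal expansions conserve energy, two expansions' costs are directly comparable without renormalization. A sketch under that reading (illustrative, not the paper's full best-basis search):

```python
import numpy as np

def entropy_cost(c):
    """Shannon-entropy information cost of an expansion c.

    p_i = |c_i|^2 / ||c||^2;  cost = -sum p_i * log p_i.
    Lower cost means the energy is concentrated in fewer
    coefficients, which is what a best-basis search prefers.
    """
    p = np.abs(np.asarray(c, dtype=float))**2
    p = p / p.sum()
    nz = p > 0                      # 0 * log 0 is taken as 0
    return -np.sum(p[nz] * np.log(p[nz]))

# Two unit-energy expansions of the same signal: the
# concentrated one costs less than the spread-out one.
concentrated = [1.0, 0.0, 0.0, 0.0]
spread = [0.5, 0.5, 0.5, 0.5]
print(entropy_cost(concentrated))  # 0.0
print(entropy_cost(spread))        # log(4) ~ 1.386
```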
An analysis of transformations
Journal of the Royal Statistical Society, Series B (Methodological), 1964
Cited by 1032 (3 self)
Abstract: In the analysis of data it is often assumed that observations y_1, y_2, ..., y_n are independently normally distributed with constant variance and with expectations specified by a model linear in a set of parameters θ. In this paper we make the less restrictive assumption that such a normal, homoscedastic, linear model is appropriate after some suitable transformation has been applied to the y's. Inferences about the transformation and about the parameters of the linear model are made by computing the likelihood function and the relevant posterior distribution. The contributions of normality ...
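The family of transformations studied in this paper is now commonly known as the Box-Cox power transformation. A minimal sketch of the transformation itself (parameter estimation by maximum likelihood, as the abstract describes, is not reproduced here):

```python
import numpy as np

def box_cox(y, lam):
    """Box-Cox power transformation of positive observations y.

    y^(lam) = (y**lam - 1) / lam   if lam != 0
            = log(y)               if lam == 0
    The lam == 0 case is the limit of the first expression
    as lam -> 0, so the family is continuous in lam.
    """
    y = np.asarray(y, dtype=float)
    if lam == 0:
        return np.log(y)
    return (y**lam - 1.0) / lam

# lam = 1 merely shifts the data by -1; lam = 0 is the log
# transform often used to stabilize variance.
print(box_cox([1.0, 2.0, 4.0], 1.0))  # [0. 1. 3.]
print(box_cox([1.0, np.e], 0.0))      # [0. 1.]
```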