Results 1–10 of 809,854
The Hungarian method for the assignment problem
Naval Res. Logist. Quart., 1955
Cited by 1259 (0 self)
"... Assuming that numerical scores are available for the performance of each of n persons on each of n jobs, the "assignment problem" is the quest for an assignment of persons to jobs so that the sum of the n scores so obtained is as large as possible. It is shown that ideas latent in the work of two Hungarian mathematicians may be exploited to yield a new method of solving this problem. ..."
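The assignment problem described in this abstract has a standard off-the-shelf solver. A minimal sketch using SciPy's `linear_sum_assignment` (which implements a Hungarian-style algorithm; the score matrix below is invented for illustration):

```python
# Maximum-score assignment of 3 persons to 3 jobs. The scores are made up;
# maximize=True asks for the largest total, matching the abstract's setup.
import numpy as np
from scipy.optimize import linear_sum_assignment

scores = np.array([
    [9, 2, 7],   # person 0's score on jobs 0..2
    [6, 4, 3],   # person 1
    [5, 8, 1],   # person 2
])
rows, cols = linear_sum_assignment(scores, maximize=True)
total = scores[rows, cols].sum()
print(list(zip(rows.tolist(), cols.tolist())), total)   # best total is 21
```

Each person appears exactly once in `rows` and each job exactly once in `cols`, so the pairs form a one-to-one assignment.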
The information bottleneck method
1999
Cited by 540 (35 self)
"... We define the relevant information in a signal x ∈ X as being the information that this signal provides about another signal y ∈ Y. Examples include the information that face images provide about the names of the people portrayed, or the information that speech sounds provide about the words spoken. ... about Y through a 'bottleneck' formed by a limited set of codewords X̃. This constrained optimization problem can be seen as a generalization of rate distortion theory in which the distortion measure d(x, x̃) emerges from the joint statistics of X and Y. This approach yields an exact set of self ..."
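The self-consistent equations alluded to at the end of the excerpt can be iterated directly on a small discrete joint distribution. A minimal sketch, in which the joint table `p_xy`, the cluster count `m`, and `beta` are all invented for illustration:

```python
# Iterates the information-bottleneck update
#   p(t|x) ~ p(t) * exp(-beta * KL(p(y|x) || p(y|t)))
# for a tiny discrete p(x, y). All numbers below are assumptions.
import numpy as np

def information_bottleneck(p_xy, m, beta, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    p_x = p_xy.sum(axis=1)
    p_y_x = p_xy / p_x[:, None]                   # p(y|x)
    q = rng.random((p_xy.shape[0], m))
    q /= q.sum(axis=1, keepdims=True)             # q[x, t] = p(t|x)
    for _ in range(iters):
        p_t = p_x @ q                             # p(t)
        p_y_t = (q * p_x[:, None]).T @ p_y_x / p_t[:, None]   # p(y|t)
        kl = (p_y_x[:, None, :]
              * np.log(p_y_x[:, None, :] / p_y_t[None, :, :])).sum(axis=-1)
        q = p_t[None, :] * np.exp(-beta * kl)
        q /= q.sum(axis=1, keepdims=True)
    return q

# Four x values; the first two share one p(y|x), the last two another.
p_xy = np.array([[0.225, 0.025],
                 [0.225, 0.025],
                 [0.025, 0.225],
                 [0.025, 0.225]])
q = information_bottleneck(p_xy, m=2, beta=5.0)
print(np.round(q, 3))   # rows with identical p(y|x) get identical q rows
```

Because the update for row `x` depends on `x` only through `p(y|x)`, values of `x` with the same conditional always end up with identical cluster posteriors.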
Learning with local and global consistency
In NIPS, 2003
Cited by 673 (21 self)
"... We consider the general problem of learning from labeled and unlabeled data, which is often called semi-supervised learning or transductive inference. A principled approach to semi-supervised learning is to design a classifying function which is sufficiently smooth with respect to the intrinsic structure collectively revealed by known labeled and unlabeled points. We present a simple algorithm to obtain such a smooth solution. Our method yields encouraging experimental results on a number of classification problems and demonstrates effective use of unlabeled data. ..."
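The "smooth solution" this abstract describes has a closed form: spread the known labels over a similarity graph until they are consistent with its structure. A minimal sketch, assuming a hand-built affinity matrix and toy labels:

```python
# Label propagation in the local-and-global-consistency style:
# F* = (I - alpha*S)^{-1} Y with S the symmetrically normalized affinity.
# The graph, alpha, and the toy data are all assumptions.
import numpy as np

def propagate(W, Y, alpha=0.9):
    """W: symmetric affinity (zero diagonal); Y: one-hot labels with
    all-zero rows for unlabeled points. Returns soft label scores."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ W @ D_inv_sqrt            # normalized affinity
    n = W.shape[0]
    return np.linalg.solve(np.eye(n) - alpha * S, Y)   # closed form, up to scale

# Two chains of three points; only the two endpoints carry labels.
W = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (3, 4), (4, 5)]:
    W[i, j] = W[j, i] = 1.0
Y = np.zeros((6, 2))
Y[0, 0] = 1.0   # point 0 labeled class 0
Y[5, 1] = 1.0   # point 5 labeled class 1
pred = propagate(W, Y).argmax(axis=1)
print(pred)     # each chain inherits its endpoint's label
```

Points 1-2 never saw a label directly; they inherit class 0 purely through the graph structure, which is the "effective use of unlabeled data" the abstract claims.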
Accurate Methods for the Statistics of Surprise and Coincidence
Computational Linguistics, 1993
Cited by 1057 (1 self)
"... Much work has been done on the statistical analysis of text. In some cases reported in the literature, inappropriate statistical methods have been used, and the statistical significance of results has not been addressed. In particular, asymptotic normality assumptions have often been used unjustifiably, leading to flawed results. This assumption of normal distribution limits the ability to analyze rare events. Unfortunately, rare events do make up a large fraction of real text. However, more applicable methods based on likelihood ratio tests are available that yield good results with relatively ..."
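The likelihood-ratio statistic this abstract advocates (often written G²) is easy to compute for a 2×2 contingency table of co-occurrence counts. A minimal sketch; the counts in the usage line are invented:

```python
# Log-likelihood ratio statistic G^2 = 2 * sum obs * ln(obs / exp) over a
# 2x2 table, e.g. (pair seen, pair not seen) x (context A, context B).
import math

def g2(k11, k12, k21, k22):
    total = k11 + k12 + k21 + k22
    row1, row2 = k11 + k12, k21 + k22
    col1, col2 = k11 + k21, k12 + k22
    stat = 0.0
    for obs, exp in [
        (k11, row1 * col1 / total),
        (k12, row1 * col2 / total),
        (k21, row2 * col1 / total),
        (k22, row2 * col2 / total),
    ]:
        if obs > 0:                      # 0 * ln 0 contributes nothing
            stat += obs * math.log(obs / exp)
    return 2.0 * stat

print(g2(20, 1, 1, 20))   # strongly associated counts: large G^2
```

Unlike a chi-squared statistic with normal approximations, G² stays well behaved when some cells hold only a handful of counts, which is the rare-event regime the abstract is concerned with.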
Estimating standard errors in finance panel data sets: comparing approaches
Review of Financial Studies, 2009
Cited by 890 (7 self)
"... In both corporate finance and asset pricing empirical work, researchers are often confronted with panel data. In these data sets, the residuals may be correlated across firms and across time, and OLS standard errors can be biased. Historically, the two literatures have used different solutions to this problem. Corporate finance has relied on clustered standard errors, while asset pricing has used the Fama-MacBeth procedure to estimate standard errors. This paper examines the different methods used in the literature and explains when the different methods yield the same (and correct ..."
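The Fama-MacBeth procedure the abstract mentions is short enough to sketch: run one cross-sectional regression per period, then take the mean of the per-period slopes and compute the standard error from their time series. The panel below is synthetic, with a true slope of 2:

```python
# Fama-MacBeth sketch: per-period cross-sectional OLS, then mean slope and
# its standard error from the time series of slopes. Data is simulated.
import numpy as np

def fama_macbeth(x_by_t, y_by_t):
    slopes = []
    for x, y in zip(x_by_t, y_by_t):
        X = np.column_stack([np.ones_like(x), x])          # intercept + slope
        slopes.append(np.linalg.lstsq(X, y, rcond=None)[0][1])
    slopes = np.asarray(slopes)
    return slopes.mean(), slopes.std(ddof=1) / np.sqrt(len(slopes))

rng = np.random.default_rng(2)
x_by_t = [rng.normal(size=50) for _ in range(40)]          # 40 periods, 50 firms
y_by_t = [2.0 * x + 0.5 * rng.normal(size=50) for x in x_by_t]
beta, se = fama_macbeth(x_by_t, y_by_t)
print(round(beta, 2), se > 0)
```

Because the standard error comes from the spread of the period-by-period estimates, it is robust to cross-sectional correlation within a period, which is exactly the contrast with firm-clustered standard errors that the paper examines.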
Probabilistic Outputs for Support Vector Machines and Comparisons to Regularized Likelihood Methods
Advances in Large Margin Classifiers, 1999
Cited by 1051 (0 self)
"... The output of a classifier should be a calibrated posterior probability to enable post-processing. Standard SVMs do not provide such probabilities. One method to create probabilities is to directly train a kernel classifier with a logit link function and a regularized maximum likelihood score. However ... sigmoid versus a kernel method trained with a regularized likelihood error function. These methods are tested on three data-mining-style data sets. The SVM+sigmoid yields probabilities of comparable quality to the regularized maximum likelihood kernel method, while still retaining the sparseness ..."
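The SVM+sigmoid idea in this abstract amounts to fitting P(y=1|f) = 1/(1+exp(A·f+B)) to held-out decision values f by maximum likelihood. A minimal sketch with synthetic margins standing in for real SVM outputs:

```python
# Platt-style calibration: fit a two-parameter sigmoid mapping raw margins
# to probabilities. The margins and labels below are synthetic assumptions.
import numpy as np
from scipy.optimize import minimize

def fit_platt(f, y):
    """Fit P(y=1|f) = 1 / (1 + exp(A*f + B)) by maximum likelihood."""
    def nll(params):
        A, B = params
        z = A * f + B
        log_p1 = -np.logaddexp(0.0, z)        # log P(y=1), computed stably
        log_p0 = z - np.logaddexp(0.0, z)     # log P(y=0)
        return -(y * log_p1 + (1.0 - y) * log_p0).sum()
    return minimize(nll, x0=[-1.0, 0.0], method="Nelder-Mead").x

rng = np.random.default_rng(0)
f = np.concatenate([rng.normal(1.5, 1.0, 100),    # margins, positive class
                    rng.normal(-1.5, 1.0, 100)])  # margins, negative class
y = np.concatenate([np.ones(100), np.zeros(100)])
A, B = fit_platt(f, y)
p = 1.0 / (1.0 + np.exp(A * f + B))
print(A < 0, p[:100].mean() > 0.5 > p[100:].mean())
```

A comes out negative so that large positive margins map to probabilities near 1; the sparseness the abstract mentions is preserved because the underlying SVM is untouched and only two extra parameters are fitted.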
A volumetric method for building complex models from range images
In Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques, ACM, 1996
Cited by 1020 (17 self)
"... A number of techniques have been developed for reconstructing surfaces by integrating groups of aligned range images. A desirable set of properties for such algorithms includes: incremental updating, representation of directional uncertainty, the ability to fill gaps in the reconstruction, ... the boundaries between regions seen to be empty and regions never observed. Using this method, we are able to integrate a large number of range images (as many as 70), yielding seamless, high-detail models of up to 2.6 million triangles. ..."
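The incremental updating this abstract lists boils down to a per-voxel weighted running average of signed distances: each voxel keeps a cumulative distance D and weight W, and each new range image folds in its own (d, w). A minimal sketch on a 1-D array of four voxels (grid size and observations are made up):

```python
# Per-voxel weighted running average of signed distances, in the spirit of
# volumetric range-image fusion. The toy "observations" are assumptions.
import numpy as np

def integrate(D, W, d_new, w_new):
    """Fold one range image's signed distances into the running average."""
    W_next = W + w_new
    D_next = (W * D + w_new * d_new) / np.maximum(W_next, 1e-12)
    return D_next, W_next

D, W = np.zeros(4), np.zeros(4)
observations = [
    (np.array([0.2, 0.1, -0.1, -0.2]), np.ones(4)),        # first scan
    (np.array([0.4, 0.1, -0.1, -0.4]), np.ones(4) * 3.0),  # more confident scan
]
for d, w in observations:
    D, W = integrate(D, W, d, w)
print(D, W)   # the surface sits where D crosses zero
```

The order of scans does not matter, which is what makes the update incremental, and the zero crossing of D between voxels 1 and 2 is where a mesh would later be extracted.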
Numerical Solutions of the Euler Equations by Finite Volume Methods Using Runge-Kutta Time-Stepping Schemes
1981
Cited by 517 (78 self)
"... A new combination of a finite volume discretization in conjunction with carefully designed dissipative terms of third order, and a Runge-Kutta time-stepping scheme, is shown to yield an effective method for solving the Euler equations in arbitrary geometric domains. The method has been used to deter ..."
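The Runge-Kutta time-stepping ingredient of this scheme can be illustrated in isolation. As a minimal stand-in for the multi-stage stepping applied to the discretized Euler equations, here is classical fourth-order Runge-Kutta on the scalar test equation u' = -u (the test problem is an assumption, not the paper's Euler solver):

```python
# One classical RK4 step, then integration of u' = -u from u(0)=1 to t=1.
import math

def rk4_step(f, t, u, dt):
    """Classical fourth-order Runge-Kutta step for u' = f(t, u)."""
    k1 = f(t, u)
    k2 = f(t + dt / 2, u + dt / 2 * k1)
    k3 = f(t + dt / 2, u + dt / 2 * k2)
    k4 = f(t + dt, u + dt * k3)
    return u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

u, t, dt = 1.0, 0.0, 0.1
for _ in range(10):                       # ten steps to reach t = 1
    u = rk4_step(lambda t, u: -u, t, u, dt)
    t += dt
print(u, math.exp(-1.0))                  # RK4 tracks the exact decay
```

In a finite-volume Euler solver, `f` would instead be the spatial residual of each cell, including the dissipative terms the abstract describes, but the staging of the update is the same.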
Gene selection for cancer classification using support vector machines
Machine Learning
Cited by 1115 (24 self)
"... DNA microarrays now permit scientists to screen thousands of genes simultaneously and determine whether those genes are active, hyperactive or silent in normal or cancerous tissue. Because these new microarray devices generate bewildering amounts of raw data, new analytical methods must ... based on Recursive Feature Elimination (RFE). We demonstrate experimentally that the genes selected by our techniques yield better classification performance and are biologically relevant to cancer. In contrast with the baseline method, our method eliminates gene redundancy automatically and yields ..."
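The Recursive Feature Elimination loop named in this abstract is simple to sketch: repeatedly fit a linear model and discard the feature with the smallest weight magnitude. Here a ridge-style least-squares fit stands in for the paper's linear SVM (an assumption), and the data are synthetic:

```python
# RFE sketch: drop the smallest-|weight| feature, refit, repeat.
import numpy as np

def rfe_ranking(X, y, lam=1e-3):
    """Return feature indices ordered least important first."""
    remaining = list(range(X.shape[1]))
    order = []
    while remaining:
        Xs = X[:, remaining]
        # ridge solution w = (X^T X + lam I)^{-1} X^T y
        w = np.linalg.solve(Xs.T @ Xs + lam * np.eye(len(remaining)),
                            Xs.T @ y)
        order.append(remaining.pop(int(np.argmin(np.abs(w)))))
    return order

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] + 1.0 * X[:, 2] + 0.1 * rng.normal(size=200)
print(rfe_ranking(X, y))   # noise feature goes first, strongest feature last
```

Refitting after each removal is what lets the procedure handle redundancy: a feature that merely duplicates a kept one sees its weight shrink once its twin carries the signal.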
Approximating discrete probability distributions with dependence trees
IEEE Transactions on Information Theory, 1968
Cited by 881 (0 self)
"... A method is presented to approximate optimally an n-dimensional discrete probability distribution by a product of second-order distributions, or the distribution of the first-order tree dependence. The problem is to find an optimum set of n−1 first-order dependence relationships among the n variables ..."
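Finding the optimal set of n−1 dependence relationships reduces to a maximum-weight spanning tree over pairwise mutual information, which is short enough to sketch end to end (the binary variables and sample counts below are invented):

```python
# Chow-Liu-style sketch: empirical pairwise mutual information between
# binary variables, then a maximum-weight spanning tree via tiny Kruskal.
import numpy as np

def mutual_info(x, y):
    """MI in nats between two 0/1 sample vectors, from empirical counts."""
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = np.mean((x == a) & (y == b))
            p_a, p_b = np.mean(x == a), np.mean(y == b)
            if p_ab > 0:
                mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def chow_liu_edges(data):
    """data: (samples, n) 0/1 array. Returns the n-1 tree edges (i, j)."""
    n = data.shape[1]
    weighted = [(mutual_info(data[:, i], data[:, j]), i, j)
                for i in range(n) for j in range(i + 1, n)]
    parent = list(range(n))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    edges = []
    for w, i, j in sorted(weighted, reverse=True):   # strongest links first
        ri, rj = find(i), find(j)
        if ri != rj:                                 # keep it a tree
            parent[ri] = rj
            edges.append((i, j))
    return edges

rng = np.random.default_rng(1)
x0 = rng.integers(0, 2, 2000)
x1 = np.where(rng.random(2000) < 0.9, x0, 1 - x0)    # x1 tracks x0
x2 = rng.integers(0, 2, 2000)                        # x2 independent
edges = chow_liu_edges(np.column_stack([x0, x1, x2]))
print(edges)   # the strong x0-x1 dependence is always kept
```

The greedy choice is optimal here because the KL divergence between the true distribution and any tree factorization decomposes into a constant minus the sum of mutual informations along the tree's edges.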