Results 1–10 of 669,763
Algorithms for Non-negative Matrix Factorization
In NIPS, 2001
"... Non-negative matrix factorization (NMF) has previously been shown to be a useful decomposition for multivariate data. Two different multiplicative algorithms for NMF are analyzed. They differ only slightly in the multiplicative factor used in the update rules. One algorithm can be shown to minimize the conventional least squares error while the other minimizes the generalized Kullback-Leibler divergence. The monotonic convergence of both algorithms can be proven using an auxiliary function analogous to that used for proving convergence of the Expectation-Maximization algorithm ..."
Cited by 1230 (5 self)
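The multiplicative update rules this entry describes are easy to state concretely. Below is a minimal NumPy sketch of the least-squares variant (this is an illustration, not the authors' code; the `eps` guard and the random initialization are implementation choices):

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=200, eps=1e-10, seed=0):
    """Multiplicative updates minimizing ||V - W H||_F^2 for nonnegative V.

    Each update multiplies the current factor by a ratio of nonnegative
    terms, so W and H stay entrywise nonnegative throughout.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H with W fixed
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W with H fixed
    return W, H
```

The monotonicity result mentioned in the abstract means the reconstruction error never increases across iterations, which is easy to check numerically.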
Kullback-Leibler Boosting
"... In this paper, we develop a general classification framework called Kullback-Leibler Boosting, or KLBoosting. KLBoosting ..."
Cited by 69 (2 self)
Rényi Divergence and Kullback-Leibler Divergence
"... Abstract—Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a ..."
Cited by 1 (0 self)
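For discrete distributions with full support, the Rényi divergence of order α has a one-line closed form, and the Kullback-Leibler divergence is its α → 1 limit. A small NumPy sketch (illustrative only; the function names are my own):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """D_alpha(p||q) = log(sum_i p_i^alpha q_i^(1-alpha)) / (alpha - 1),
    for strictly positive discrete p, q and alpha > 0, alpha != 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

def kl_divergence(p, q):
    """Kullback-Leibler divergence, the alpha -> 1 limit of Renyi divergence."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))
```

Taking α close to 1 recovers the KL value numerically, and the divergence is nondecreasing in α, two of the properties the paper surveys.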
Weighted non-negative matrix factorization and face feature extraction
In Image and Vision Computing, 2008
"... In this paper we consider weighted non-negative matrix factorizations and we show that the popular algorithms of Lee and Seung can incorporate such a weighting. We then prove that for appropriately chosen weighting matrices, the weighted Euclidean distance function and the weighted generalized Kullback-Leibler divergence ..."
Cited by 12 (0 self)
Symmetrizing the Kullback-Leibler Distance
In IEEE Transactions on Information Theory, 2000
"... We define a new distance measure, the resistor-average distance, between two probability distributions that is closely related to the Kullback-Leibler distance. While the Kullback-Leibler distance is asymmetric in the two distributions, the resistor-average distance is not. It arises from geometric ..."
Cited by 42 (0 self)
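The name comes from the way parallel resistors combine: the resistor-average distance R satisfies 1/R(p,q) = 1/D(p‖q) + 1/D(q‖p), which makes it symmetric by construction. A minimal sketch for discrete distributions (function names are my own):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence for strictly positive discrete p, q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

def resistor_average(p, q):
    """Resistor-average distance: 1/R = 1/D(p||q) + 1/D(q||p),
    i.e. the two KL directions combined like parallel resistances."""
    d_pq, d_qp = kl(p, q), kl(q, p)
    return d_pq * d_qp / (d_pq + d_qp)
```

As with parallel resistors, R is always smaller than either one-directional KL distance, and swapping the arguments leaves it unchanged.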
The centroid of the symmetrical Kullback-Leibler distance
In IEEE Signal Processing Letters, 2002
"... Abstract—This paper discusses the computation of the centroid induced by the symmetrical Kullback-Leibler distance. It is shown that it is the unique zeroing argument of a function which only depends on the arithmetic and the normalized geometric mean of the cluster. An efficient algorithm for its c ..."
Cited by 13 (0 self)
The Kullback-Leibler Divergence Rate between Markov Sources
In IEEE Transactions on Information Theory, 2004
"... Abstract—In this work, we provide a computable expression for the Kullback-Leibler divergence rate lim ... between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions described by the probability distributions ... and ..., respectively. We illustrate it n ..."
Cited by 24 (1 self)
On the symmetrical Kullback-Leibler Jeffreys centroids
"... Due to the success of the bag-of-word modeling paradigm, clustering histograms has become an important ingredient of modern information processing. Clustering histograms can be performed using the celebrated k-means centroid-based algorithm. From the viewpoint of applications, it is usually required to deal with symmetric distances. In this letter, we consider the Jeffreys divergence that symmetrizes the Kullback-Leibler divergence, and investigate the computation of Jeffreys centroids. We first prove that the Jeffreys centroid can be expressed analytically using the Lambert W function ..."
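For positive (unnormalized) arrays, the analytic form alluded to here works coordinate-wise: c_i = a_i / W(e·a_i/g_i), where a and g are the arithmetic and geometric means of the cluster and W is the principal Lambert W branch (the normalized-histogram case treated in the letter needs an extra step). A self-contained sketch, with W computed by Newton's method to avoid external dependencies:

```python
import numpy as np

def lambert_w(x, n_iter=40):
    """Principal branch of Lambert W for x > 0, via Newton's method on
    f(w) = w * exp(w) - x."""
    w = np.log1p(x)                     # rough initial guess, adequate for x > 0
    for _ in range(n_iter):
        ew = np.exp(w)
        w -= (w * ew - x) / (ew * (w + 1.0))
    return w

def jeffreys_centroid(H):
    """Jeffreys centroid of positive arrays (rows of H), coordinate-wise
    c_i = a_i / W(e * a_i / g_i), a/g = arithmetic/geometric means."""
    H = np.asarray(H, float)
    a = H.mean(axis=0)
    g = np.exp(np.log(H).mean(axis=0))
    return a / lambert_w(np.e * a / g)
```

By the AM-GM inequality the argument e·a_i/g_i is at least e, so the Newton iteration stays on the principal branch; for a cluster of identical arrays W(e) = 1 and the centroid is the array itself.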
Kullback-Leibler Divergence Estimation of Continuous Distributions
In Proceedings of the IEEE International Symposium on Information Theory, 2008
"... Abstract—We present a method for estimating the KL divergence between continuous densities and we prove it converges almost surely. Divergence estimation is typically solved estimating the densities first. Our main result shows this intermediate step is unnecessary and that the divergence can be eit ..."
Cited by 18 (0 self)
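A common way to skip the density-estimation step is a nearest-neighbor estimator: compare each sample's distance to its nearest neighbor within its own sample set against its distance to the other set. The sketch below is the one-dimensional 1-NN variant (an illustration of the general idea under these assumptions, not necessarily the exact estimator of this paper):

```python
import numpy as np

def kl_estimate_1nn(x, y):
    """1-NN estimate of D(P||Q) from 1-D samples x ~ P and y ~ Q,
    with no intermediate density estimation."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n, m = len(x), len(y)
    d_xx = np.abs(x[:, None] - x[None, :])
    np.fill_diagonal(d_xx, np.inf)                    # exclude self-distance
    rho = d_xx.min(axis=1)                            # nearest other sample of P
    nu = np.abs(x[:, None] - y[None, :]).min(axis=1)  # nearest sample of Q
    return float(np.mean(np.log(nu / rho)) + np.log(m / (n - 1)))
```

The estimate is near zero when both sample sets come from the same density and grows as the densities separate, consistent with the almost-sure convergence the abstract claims for the true divergence.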