Results 1 - 10 of 4,293,853
A Note On the Unification of the Akaike Information Criterion
- J. Royal Statist. Soc. (B), 1998
"... this paper is first to propose a generalized Kullback-Leibler information that can measure the discrepancy between a robust function evaluated under both the true model and fitted models. Next, we use this generalized Kullback-Leibler information to obtain three generalized Akaike information criter ..."
Cited by 4 (1 self)
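For reference, the classical criterion that the abstract above generalizes is AIC = 2k - 2 ln(L-hat), where k is the number of fitted parameters and L-hat the maximized likelihood; the robust, generalized versions proposed in the paper are not reproduced here. A minimal sketch of the classical formula:

```python
def aic(max_log_likelihood, n_params):
    """Classical Akaike Information Criterion: AIC = 2k - 2 * ln(L-hat).
    Lower values indicate a better fit/complexity trade-off."""
    return 2 * n_params - 2 * max_log_likelihood

# compare two hypothetical fitted models (log-likelihoods and parameter counts are made up)
print(aic(-120.4, 3), aic(-118.9, 5))
```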
Kullback-Leibler Boosting
"... In this paper, we develop a general classification framework called Kullback-Leibler Boosting, or KLBoosting. KLBoosting ..."
Cited by 69 (2 self)
Kullback-Leibler Information Criterion
2007
"... We propose new methods for analyzing the relative performance of two competing, misspecified models in the presence of possible data instability. The main idea is to develop a measure of the relative “local performance ” for the two models, and to investigate its stability over time by means of stat ..."
Abstract
- Add to MetaCart
of statistical tests. The models ’ performance can be evaluated using either in-sample or out-of-sample criteria. In the former case, we suggest using the local Kullback-Leibler information criterion, whereas in the latter, we consider the local out-of-sample forecast loss, for a general loss function. We
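The abstract does not spell out the measure. Purely as an illustration, a rolling-window average of the log-likelihood difference between two fitted models is one simple way to track "local" relative in-sample performance; the window size and the plain moving average below are assumptions, not the paper's construction:

```python
import numpy as np

def local_relative_fit(loglik_a, loglik_b, window=50):
    """Rolling-window mean of the per-observation log-likelihood difference
    between two fitted models. Positive values favor model A in that window.
    This is a hypothetical stand-in for a 'local relative performance' measure,
    not the statistic defined in the paper."""
    d = np.asarray(loglik_a, dtype=float) - np.asarray(loglik_b, dtype=float)
    kernel = np.ones(window) / window
    return np.convolve(d, kernel, mode="valid")

rng = np.random.default_rng(0)
print(local_relative_fit(rng.normal(0.1, 1, 300), rng.normal(0.0, 1, 300))[:5])
```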
Symmetrizing the Kullback-Leibler Distance
- IEEE Transactions on Information Theory, 2000
"... We define a new distance measure the resistor-average distance between two probability distributions that is closely related to the Kullback-Leibler distance. While the KullbackLeibler distance is asymmetric in the two distributions, the resistor-average distance is not. It arises from geometric ..."
Cited by 42 (0 self)
information-theoretic "distance" measure from a viewpoint of theory. If $p_0, p_1$ are two probability densities, the Kullback-Leibler distance is defined to be $D(p_1 \| p_0) = \int p_1(x) \log \frac{p_1(x)}{p_0(x)} \, dx$ (1). In this paper, $\log(\cdot)$ has base two. The Kullback-Leibler distance is but one example
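A minimal discrete-distribution sketch of the definition quoted above, using base-2 logarithms as stated. The resistor_average helper is an assumption: it combines the two directed KL distances the way parallel resistances combine (1/R = 1/D(p||q) + 1/D(q||p)), which matches the name but should be checked against the paper:

```python
import numpy as np

def kl_divergence(p, q, base=2.0):
    """Kullback-Leibler distance D(p || q) for discrete distributions,
    D(p || q) = sum_x p(x) * log(p(x) / q(x)), with base-2 logs as in the paper."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing
    return np.sum(p[mask] * (np.log(p[mask] / q[mask]) / np.log(base)))

def resistor_average(p, q):
    """Assumed symmetrization: combine D(p||q) and D(q||p) like parallel resistors."""
    d_pq, d_qp = kl_divergence(p, q), kl_divergence(q, p)
    return (d_pq * d_qp) / (d_pq + d_qp)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(kl_divergence(p, q), resistor_average(p, q))
```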
Kullback-Leibler designs
"... Space filling designs are commonly used for selecting the input values of time-consuming computer codes. In this paper, the Kullback-Leibler information is used to spread the design points evenly throughout the experimental region. A comparison with the most common designs used for computer experime ..."
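The snippet does not give the construction. Purely to illustrate the space-filling idea, one can score candidate designs with a nearest-neighbour entropy proxy: spreading points evenly over the region raises the design's estimated entropy, i.e. lowers its KL divergence from the uniform distribution. The proxy and the random-search loop below are assumptions, not the paper's method:

```python
import numpy as np

def entropy_proxy(design):
    """Mean log nearest-neighbour distance of the design points. Larger values
    mean the points are spread more evenly; up to constants this tracks a
    nearest-neighbour (Kozachenko-Leonenko style) entropy estimate. Used here
    only as an illustrative space-filling score."""
    d = np.linalg.norm(design[:, None, :] - design[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.mean(np.log(d.min(axis=1)))

rng = np.random.default_rng(1)
candidates = [rng.uniform(size=(20, 2)) for _ in range(200)]  # 20-point designs in [0,1]^2
best = max(candidates, key=entropy_proxy)
print(entropy_proxy(best))
```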
The centroid of the symmetrical Kullback-Leibler distance
- IEEE Signal Processing Letters, 2002
"... Abstract—This paper discusses the computation of the centroid induced by the symmetrical Kullback–Leibler distance. It is shown that it is the unique zeroing argument of a function which only depends on the arithmetic and the normalized geometric mean of the cluster. An efficient algorithm for its c ..."
Cited by 13 (0 self)
Rényi Divergence and Kullback-Leibler Divergence
"... Abstract—Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a ..."
Cited by 1 (0 self)
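A minimal sketch of the standard finite-alphabet definition of Rényi divergence of order alpha (not quoted from this paper), which recovers the Kullback-Leibler divergence as alpha tends to 1:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1), in nats:
        D_alpha(p || q) = 1/(alpha - 1) * log( sum_i p_i^alpha * q_i^(1 - alpha) )."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(renyi_divergence(p, q, 0.5), renyi_divergence(p, q, 0.999))  # latter is close to KL(p||q) in nats
```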
On the symmetrical Kullback-Leibler Jeffreys centroids
"... Due to the success of the bag-of-word modeling paradigm, clustering histograms has become an important ingredient of modern information processing. Clustering histograms can be performed using the celebrated k-means centroid-based algorithm. From the viewpoint of applications, it is usually required ..."
Abstract
- Add to MetaCart
required to deal with symmetric distances. In this letter, we consider the Jeffreys divergence that symmetrizes the Kullback-Leibler divergence, and investigate the computation of Jeffreys centroids. We first prove that the Jeffreys centroid can be expressed analytically using the Lambert W function
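The closed form via the Lambert W function is the letter's result and is not reproduced here. As a rough numerical stand-in, assuming strictly positive frequency histograms and natural logarithms, one can approximate a Jeffreys centroid by directly minimizing the average Jeffreys divergence to the cluster:

```python
import numpy as np
from scipy.optimize import minimize

def kl(p, q):
    # Kullback-Leibler divergence in nats for strictly positive histograms
    return np.sum(p * np.log(p / q))

def jeffreys(p, q):
    # Jeffreys divergence: the symmetrized KL divergence J(p, q) = KL(p||q) + KL(q||p)
    return kl(p, q) + kl(q, p)

def jeffreys_centroid(hists):
    """Brute-force numerical sketch: minimize the mean Jeffreys divergence to the
    cluster over the probability simplex. Not the closed-form Lambert-W expression
    derived in the letter."""
    hists = np.asarray(hists, dtype=float)
    d = hists.shape[1]
    def objective(w):
        c = np.exp(w)            # optimize unconstrained weights,
        c = c / c.sum()          # renormalize inside the objective
        return np.mean([jeffreys(h, c) for h in hists])
    res = minimize(objective, np.log(np.full(d, 1.0 / d)), method="Nelder-Mead")
    c = np.exp(res.x)
    return c / c.sum()

cluster = [np.array([0.6, 0.3, 0.1]), np.array([0.2, 0.5, 0.3])]
print(jeffreys_centroid(cluster))
```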