Results 1 - 10 of 4,293,853

A Note On the Unification of the Akaike Information Criterion

by Peide Shi, Chih-Ling Tsai - J. Royal Statist. Soc. (B), 1998
"... this paper is first to propose a generalized Kullback-Leibler information that can measure the discrepancy between a robust function evaluated under both the true model and fitted models. Next, we use this generalized Kullback-Leibler information to obtain three generalized Akaike information criter ..."
Cited by 4 (1 self)

Kullback-Leibler Boosting

by Ce Liu, Heung-Yeung Shum
"... In this paper, we develop a general classification framework called Kullback-Leibler Boosting, or KLBoosting. KLBoosting ..."
Cited by 69 (2 self)

Kullback-Leibler Information Criterion

by Raffaella Giacomini, Barbara Rossi, 2007
"... We propose new methods for analyzing the relative performance of two competing, misspecified models in the presence of possible data instability. The main idea is to develop a measure of the relative “local performance ” for the two models, and to investigate its stability over time by means of stat ..."
Abstract - Add to MetaCart
of statistical tests. The models ’ performance can be evaluated using either in-sample or out-of-sample criteria. In the former case, we suggest using the local Kullback-Leibler information criterion, whereas in the latter, we consider the local out-of-sample forecast loss, for a general loss function. We

Symmetrizing the Kullback-Leibler Distance

by Don H. Johnson, Sinan Sinanovic - IEEE Transactions on Information Theory , 2000
"... We define a new distance measure the resistor-average distance between two probability distributions that is closely related to the Kullback-Leibler distance. While the KullbackLeibler distance is asymmetric in the two distributions, the resistor-average distance is not. It arises from geometric ..."
Cited by 42 (0 self)
information-theoretic "distance" measure from a viewpoint of theory. If p0, p1 are two probability densities, the Kullback-Leibler distance is defined to be D(p1 ‖ p0) = ∫ p1(x) log( p1(x) / p0(x) ) dx  (1). In this paper, log() has base two. The Kullback-Leibler distance is but one example
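As a concrete companion to equation (1) above, here is a minimal Python sketch (my own illustration, not code from the paper) that evaluates the discrete Kullback-Leibler distance in both directions, using base-two logarithms as the snippet specifies. The resistor_average function assumes the parallel-resistor combination suggested by the name in the abstract; that formula is an assumption here, not a definition quoted from the paper.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete Kullback-Leibler distance D(p || q) in bits (log base two),
    mirroring equation (1) above with the integral replaced by a sum."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def resistor_average(p, q):
    """Assumed 'resistor-average' symmetrization: combine the two directed
    KL distances the way parallel resistors combine, 1/R = 1/D(p||q) + 1/D(q||p).
    This formula is inferred from the name in the abstract, not quoted from
    the paper itself."""
    d_pq = kl_divergence(p, q)
    d_qp = kl_divergence(q, p)
    if d_pq == 0.0 or d_qp == 0.0:  # identical distributions
        return 0.0
    return 1.0 / (1.0 / d_pq + 1.0 / d_qp)

if __name__ == "__main__":
    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.4, 0.4, 0.2])
    print(kl_divergence(p, q), kl_divergence(q, p), resistor_average(p, q))
```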

Kullback-Leibler designs

by Astrid Jourdan, Jessica Franco
"... Space filling designs are commonly used for selecting the input values of time-consuming computer codes. In this paper, the Kullback-Leibler information is used to spread the design points evenly throughout the experimental region. A comparison with the most common designs used for computer experime ..."

proxy of the Kullback-Leibler distance.

by Ronaldo Dias, Nancy L. Garcia
"... estimator for basis selection based on a ..."

Edgeworth expansions of the Kullback-Leibler information

by Jen-Jen Lin, Naoki Saito, Richard A. Levine , 1999
"... ..."
Abstract - Cited by 2 (2 self) - Add to MetaCart
Abstract not found

The centroid of the symmetrical Kullback-Leibler distance

by Raymond Veldhuis - IEEE Signal Processing Letters, 2002
"... Abstract—This paper discusses the computation of the centroid induced by the symmetrical Kullback–Leibler distance. It is shown that it is the unique zeroing argument of a function which only depends on the arithmetic and the normalized geometric mean of the cluster. An efficient algorithm for its c ..."
Cited by 13 (0 self)

Rényi Divergence and Kullback-Leibler Divergence

by Tim van Erven
"... Abstract—Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a ..."
Cited by 1 (0 self)
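To make the relationship sketched in this abstract concrete, here is a small illustrative Python snippet (a sketch of mine, not code from the paper): it implements the order-alpha Rényi divergence for discrete distributions, D_alpha(P ‖ Q) = (1/(alpha-1)) log Σ_i p_i^alpha q_i^(1-alpha), and checks numerically that it approaches the Kullback-Leibler divergence as alpha tends to 1.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence (natural log) between discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1):
    (1 / (alpha - 1)) * log( sum_i p_i^alpha * q_i^(1 - alpha) )."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p_i = 0 contribute nothing for alpha > 0
    s = np.sum(p[mask] ** alpha * q[mask] ** (1.0 - alpha))
    return float(np.log(s) / (alpha - 1.0))

if __name__ == "__main__":
    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.4, 0.4, 0.2])
    # As alpha approaches 1, the Rényi divergence approaches the KL divergence.
    for alpha in (0.5, 0.9, 0.99, 0.999):
        print(alpha, renyi_divergence(p, q, alpha))
    print("KL:", kl_divergence(p, q))
```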

On the symmetrical Kullback-Leibler Jeffreys centroids

by Frank Nielsen
"... Due to the success of the bag-of-word modeling paradigm, clustering histograms has become an important ingredient of modern information processing. Clustering histograms can be performed using the celebrated k-means centroid-based algorithm. From the viewpoint of applications, it is usually required ..."
Abstract - Add to MetaCart
required to deal with symmetric distances. In this letter, we consider the Jeffreys divergence that symmetrizes the Kullback-Leibler divergence, and investigate the computation of Jeffreys centroids. We first prove that the Jeffreys centroid can be expressed analytically using the Lambert W function
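The analytic expression via the Lambert W function is specific to the paper and is not reproduced here. As a generic numerical illustration only, the sketch below approximates the Jeffreys centroid of a small cluster of frequency histograms by direct minimization; the softmax parameterization and the use of scipy's Nelder-Mead optimizer are my own choices, not the paper's algorithm.

```python
import numpy as np
from scipy.optimize import minimize

def jeffreys(p, q, eps=1e-12):
    """Jeffreys divergence: J(p, q) = KL(p || q) + KL(q || p)."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum((p - q) * np.log(p / q)))

def jeffreys_centroid(histograms):
    """Numerically approximate the frequency histogram c minimizing
    sum_i J(h_i, c). Generic optimization, not the paper's closed form."""
    H = np.asarray(histograms, float)
    d = H.shape[1]

    def objective(z):
        # Softmax parameterization keeps the candidate a valid histogram.
        c = np.exp(z - z.max())
        c /= c.sum()
        return sum(jeffreys(h, c) for h in H)

    res = minimize(objective, x0=np.zeros(d), method="Nelder-Mead")
    c = np.exp(res.x - res.x.max())
    return c / c.sum()

if __name__ == "__main__":
    cluster = [[0.6, 0.3, 0.1], [0.4, 0.4, 0.2], [0.5, 0.2, 0.3]]
    print(jeffreys_centroid(cluster))
```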