Results 1–10 of 87,538
A Note On the Unification of the Akaike Information Criterion
J. Royal Statist. Soc. B, 1998
"... this paper is the first to propose a generalized Kullback-Leibler information that can measure the discrepancy between a robust function evaluated under both the true model and fitted models. Next, we use this generalized Kullback-Leibler information to obtain three generalized Akaike information criteria ..."
Cited by 4 (1 self)
Kullback-Leibler Information Criterion
2007
"... We propose new methods for analyzing the relative performance of two competing, misspecified models in the presence of possible data instability. The main idea is to develop a measure of the relative “local performance” of the two models, and to investigate its stability over time by means of statistical tests. The models' performance can be evaluated using either in-sample or out-of-sample criteria. In the former case, we suggest using the local Kullback-Leibler information criterion, whereas in the latter, we consider the local out-of-sample forecast loss, for a general loss function. We ..."
Rényi Divergence and Kullback-Leibler Divergence
"... Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a ..."
Cited by 1 (0 self)
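As a quick numerical illustration of the limit relationship this abstract alludes to, here is a minimal sketch in Python (the distributions and function names are hypothetical, chosen only for the example):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits, for discrete distributions."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha (alpha > 0, alpha != 1) in bits."""
    total = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q))
    return math.log2(total) / (alpha - 1.0)

p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]

# As alpha -> 1, the Renyi divergence converges to the KL divergence,
# and it is non-decreasing in alpha.
print(renyi_divergence(p, q, 0.999999))
print(kl_divergence(p, q))
```

The two printed values should agree to several decimal places, reflecting that KL divergence is the α → 1 limit of Rényi divergence.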
Bootstrap estimate of Kullback-Leibler information for model selection
Statistica Sinica, 1997
"... Estimation of the Kullback-Leibler amount of information is a crucial part of deriving a statistical model selection procedure based on the likelihood principle, like AIC. To discriminate between nested models, we have to estimate it up to the order of a constant, while the Kullback-Leibler information itself ..."
Cited by 27 (0 self)
Relations between Kullback-Leibler distance and Fisher information
2002
"... The Kullback-Leibler distance between two probability densities that are parametric perturbations of each other is related to the Fisher information. We generalize this relationship to the case when the perturbations may not be small and when the two densities are nonparametric. ..."
Cited by 3 (0 self)
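The small-perturbation case of this relationship can be checked numerically: for a parametric family, D(p_θ || p_{θ+δ}) ≈ (δ²/2)·I(θ) for small δ. A minimal sketch for the Gaussian location family (grid bounds and step count are arbitrary choices):

```python
import math

def normal_pdf(x, mu, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def kl_numeric(mu1, mu2, sigma=1.0, lo=-10.0, hi=10.0, n=200_000):
    """Midpoint-rule approximation of D(N(mu1, s^2) || N(mu2, s^2)) in nats."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        p = normal_pdf(x, mu1, sigma)
        q = normal_pdf(x, mu2, sigma)
        total += p * math.log(p / q) * dx
    return total

# The Fisher information of the location parameter of N(theta, 1) is I(theta) = 1,
# so for a small perturbation delta the KL distance should be close to delta^2 / 2.
delta = 0.1
print(kl_numeric(0.0, delta), 0.5 * 1.0 * delta ** 2)
```

For this family the second-order approximation is in fact exact, so the two printed numbers agree up to quadrature error.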
Kullback-Leibler approximation of spectral density functions
IEEE Trans. Inform. Theory, 2003
"... We introduce a Kullback–Leibler-type distance between spectral density functions of stationary stochastic processes and solve the problem of optimal approximation of a given spectral density by one that is consistent with prescribed second-order statistics. In general, such statistics are ..."
Cited by 54 (20 self)
Index Terms—Approximation of power spectra, cross-entropy minimization, Kullback–Leibler distance, mutual information, optimization, spectral estimation.
Minimal Kullback-Leibler Information Designs (original title: "Plans d'information de Kullback-Leibler minimale")
2008
"... In recent years, numerical simulation has come to model ever more complex phenomena. Such problems, generally of very high dimension, require sophisticated simulation codes that are very costly in computation time ..."
Wavelet-Based Texture Retrieval Using Generalized Gaussian Density and Kullback-Leibler Distance
IEEE Trans. Image Processing, 2002
"... We present a statistical view of the texture retrieval problem by combining the two related tasks, namely feature extraction (FE) and similarity measurement (SM), into a joint modeling and classification scheme. We show that using a consistent estimator of texture model parameters for the FE step followed by computing the Kullback-Leibler distance (KLD) between estimated models for the SM step is asymptotically optimal in terms of retrieval error probability. The statistical scheme leads to a new wavelet-based texture retrieval method that is based on the accurate modeling of the marginal ..."
Cited by 241 (4 self)
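What makes the SM step above cheap is that the KLD between two generalized Gaussian densities admits a closed form. The sketch below is my reconstruction of that closed form under the parameterization p(x; a, b) ∝ exp(−(|x|/a)^b), verified here only against the Gaussian special case (function names are my own):

```python
import math
from math import gamma, log

def ggd_kld(a1, b1, a2, b2):
    """Closed-form KLD (in nats) between zero-mean generalized Gaussians
    p(x; a, b) = b / (2 * a * gamma(1/b)) * exp(-(|x| / a) ** b)."""
    return (log((b1 * a2 * gamma(1.0 / b2)) / (b2 * a1 * gamma(1.0 / b1)))
            + (a1 / a2) ** b2 * gamma((b2 + 1.0) / b1) / gamma(1.0 / b1)
            - 1.0 / b1)

def gauss_kld(s1, s2):
    """KLD between zero-mean Gaussians N(0, s1^2) and N(0, s2^2), for checking."""
    return log(s2 / s1) + s1 ** 2 / (2.0 * s2 ** 2) - 0.5

# b = 2 reduces the GGD to a Gaussian with standard deviation a / sqrt(2),
# so the two formulas should agree in that case.
print(ggd_kld(1.0, 2.0, 1.5, 2.0))
print(gauss_kld(1.0 / math.sqrt(2.0), 1.5 / math.sqrt(2.0)))
```

Because the KLD depends only on the two parameter pairs, retrieval reduces to evaluating this expression per candidate texture rather than comparing histograms.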
Kullback-Leibler Boosting
"... In this paper, we develop a general classification framework called Kullback-Leibler Boosting, or KLBoosting. KLBoosting ..."
Cited by 68 (2 self)
Symmetrizing the Kullback-Leibler Distance
IEEE Transactions on Information Theory, 2000
"... We define a new distance measure, the resistor-average distance, between two probability distributions that is closely related to the Kullback-Leibler distance. While the Kullback-Leibler distance is asymmetric in the two distributions, the resistor-average distance is not. It arises from geometric ..."
Cited by 42 (0 self)
information-theoretic "distance" measure from a theoretical viewpoint. If p_0, p_1 are two probability densities, the Kullback-Leibler distance is defined to be

    D(p_1 || p_0) = ∫ p_1(x) log( p_1(x) / p_0(x) ) dx.   (1)

In this paper, log(·) has base two. The Kullback-Leibler distance is but one example ...
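Assuming the parallel-resistor construction the name suggests (a harmonic combination of the two directed KL distances; the concrete distributions below are arbitrary), a minimal sketch:

```python
import math

def kl(p, q):
    """Kullback-Leibler distance D(p || q) in bits, per Eq. (1), for discrete p, q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def resistor_average(p, q):
    """Combine D(p||q) and D(q||p) like resistors in parallel: 1/R = 1/D1 + 1/D2."""
    d_pq, d_qp = kl(p, q), kl(q, p)
    return 1.0 / (1.0 / d_pq + 1.0 / d_qp)

p = [0.6, 0.4]
q = [0.3, 0.7]

# Unlike the KL distance itself, the resistor-average is symmetric in p and q,
# and, like parallel resistance, it is smaller than either directed KL distance.
print(kl(p, q), kl(q, p), resistor_average(p, q))
```

Swapping p and q changes the two directed distances but leaves the resistor-average unchanged, which is exactly the symmetrization the abstract describes.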