Results 1 – 10 of 5,840
Mismatched estimation and relative entropy
IEEE Trans. Inf. Theory, 2010
"... Abstract—A random variable with distribution P is observed in Gaussian noise and is estimated by a mismatched minimum mean-square estimator that assumes that the distribution is Q, instead of P. This paper shows that the integral over all signal-to-noise ratios (SNRs) of the excess mean-square estimation e ..."
Cited by 25 (3 self)
error incurred by the mismatched estimator is twice the relative entropy (in nats). This representation of relative entropy can be generalized to non-real-valued random variables, and can be particularized to give new general representations of mutual information in terms of conditional means. Inspired ...
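The identity quoted in this abstract admits a quick numerical check. The sketch below is our own illustration, not from the paper: it takes scalar Gaussian priors P = N(0, 1) and Q = N(0, 2), for which the mismatched MMSE estimator has a simple linear closed form, and verifies that the integral of the excess mean-square error over all SNRs equals 2·D(P‖Q) = ln 2 − 1/2 nats.

```python
# Numerical check, with illustrative parameters: P = N(0,1) observed in
# Gaussian noise at SNR `snr`, estimated by the linear MMSE rule that
# (wrongly) assumes the prior Q = N(0,2). The integral over all SNRs of
# the excess mean-square error should equal 2*D(P||Q) nats.
import numpy as np

var_p, var_q = 1.0, 2.0  # true and assumed prior variances

def excess_mse(snr):
    # mismatched estimate x_hat = a*Y with Y = sqrt(snr)*X + N, N ~ N(0,1)
    a = np.sqrt(snr) * var_q / (1.0 + snr * var_q)
    mse_mismatched = (1.0 - a * np.sqrt(snr))**2 * var_p + a**2
    mse_matched = var_p / (1.0 + snr * var_p)
    return mse_mismatched - mse_matched

# trapezoidal rule on a log-spaced grid (the integrand decays like 1/snr^2)
snr = np.logspace(-8, 8, 400001)
y = excess_mse(snr)
integral = 0.5 * np.sum((y[1:] + y[:-1]) * np.diff(snr))

kl = 0.5 * (var_p / var_q - 1.0 - np.log(var_p / var_q))  # D(P||Q) in nats
print(integral, 2 * kl)  # both ~= ln 2 - 1/2 ~= 0.1931
```

The log-spaced grid matters: the integrand is concentrated around SNR of order one but has a slow 1/snr² tail.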
α-z-Rényi relative entropies
"... We consider a two-parameter family of Rényi relative entropies D_{α,z}(ρ‖σ) that are quantum generalisations of the classical Rényi divergence D(p‖q). This family includes many known relative entropies (or divergences) such as the quantum relative entropy, the recently defined quantum Rényi divergenc ..."
Cited by 1 (1 self)
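The two-parameter family is easy to evaluate numerically for small density matrices. A minimal sketch, assuming the commonly used form D_{α,z}(ρ‖σ) = (α−1)⁻¹ log Tr[(ρ^{α/2z} σ^{(1−α)/z} ρ^{α/2z})^z]; the function names are ours, not the paper's.

```python
# Sketch of the two-parameter family, in the commonly used form
#   D_{a,z}(rho||sigma) = log Tr[(rho^{a/2z} sigma^{(1-a)/z} rho^{a/2z})^z] / (a - 1)
# for positive-definite density matrices.
import numpy as np

def herm_power(A, t):
    # A**t for a Hermitian positive-definite matrix, via eigendecomposition
    w, U = np.linalg.eigh(A)
    return (U * w**t) @ U.conj().T

def renyi_az(rho, sigma, a, z):
    r = herm_power(rho, a / (2 * z))
    inner = r @ herm_power(sigma, (1 - a) / z) @ r
    # inner is Hermitian positive semidefinite; clip rounding noise
    w = np.clip(np.linalg.eigvalsh(inner), 0.0, None)
    return float(np.log(np.sum(w**z)) / (a - 1))

# Sanity check: for commuting (diagonal) states the quantity reduces to the
# classical Renyi divergence (1/(a-1)) * log sum_i p_i^a q_i^(1-a), for any z.
p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
d = renyi_az(np.diag(p), np.diag(q), a=1.5, z=2.0)
classical = float(np.log(np.sum(p**1.5 * q**-0.5)) / 0.5)
print(d, classical)  # equal
```

The commuting case is a useful unit test precisely because the z-dependence drops out there, as the abstract's classical-limit claim requires.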
Relative Entropy Policy Search
"... Policy search is a successful approach to reinforcement learning. However, policy improvements often result in the loss of information. Hence, it has been marred by premature convergence and implausible solutions. As first suggested in the context of covariant policy gradients (Bagnell and Schneider ..."
Cited by 48 (19 self)
and Schneider 2003), many of these problems may be addressed by constraining the information loss. In this paper, we continue this path of reasoning and suggest the Relative Entropy Policy Search (REPS) method. The resulting method differs significantly from previous policy gradient approaches and yields
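The information-loss constraint described in this abstract has a simple closed-form core that can be sketched in a few lines: maximizing expected reward over a discrete action distribution subject to KL(q‖p) ≤ ε yields q proportional to p·exp(R/η). The sketch below is illustrative only, not the full REPS algorithm; all names and values are ours.

```python
# Illustrative KL-constrained reweighting (not the full REPS algorithm):
# the solution of  max_q E_q[R]  s.t.  KL(q||p) <= eps  over a discrete
# action distribution is q proportional to p * exp(R / eta), with the
# temperature eta tuned so the KL constraint is met with equality.
import numpy as np

def reps_reweight(p, rewards, eps, lo=1e-6, hi=1e6, iters=200):
    def q_of(eta):
        w = p * np.exp((rewards - rewards.max()) / eta)  # shifted for stability
        return w / w.sum()
    def kl(q):
        nz = q > 0
        return float(np.sum(q[nz] * np.log(q[nz] / p[nz])))
    # KL(q||p) decreases monotonically in eta, so bisect (in log space)
    for _ in range(iters):
        mid = np.sqrt(lo * hi)
        if kl(q_of(mid)) > eps:
            lo = mid   # constraint violated: raise the temperature
        else:
            hi = mid
    return q_of(np.sqrt(lo * hi))

p = np.full(4, 0.25)                       # old (uniform) policy, 4 actions
rewards = np.array([1.0, 0.0, 0.5, -1.0])  # hypothetical per-action returns
q = reps_reweight(p, rewards, eps=0.1)
print(q)  # mass shifts toward the high-reward actions
```

The point of the constraint is visible here: as ε shrinks, the new distribution stays close to the old one instead of collapsing onto the single best action.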
Relative entropy in hyperbolic relaxation
 Commun. Math. Sci
"... Abstract. We provide a framework so that hyperbolic relaxation systems are endowed with a relative entropy identity. This allows a direct proof of convergence in the regime that the limiting solution is smooth. ..."
Cited by 14 (6 self)
INEQUALITIES FOR QUANTUM RELATIVE ENTROPY
2004
"... Some logarithmic trace inequalities involving the notions of relative entropy are re-obtained from a log-majorization result. The thermodynamic inequality is generalized and a chain of equivalent statements involving this inequality and the Peierls–Bogoliubov inequality is obtained. ..."
Cited by 4 (0 self)
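The Peierls–Bogoliubov inequality mentioned in this abstract, Tr e^{A+B} ≥ Tr(e^A)·exp(Tr(B e^A)/Tr e^A) for Hermitian A and B, is easy to spot-check numerically; the instance below is a random one of our own choosing, not the paper's setup.

```python
# Spot-check of the Peierls-Bogoliubov inequality for Hermitian A, B:
#   Tr e^{A+B}  >=  Tr(e^A) * exp( Tr(B e^A) / Tr(e^A) )
# on a random instance.
import numpy as np

rng = np.random.default_rng(0)

def rand_hermitian(n):
    M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (M + M.conj().T) / 2

def expm_herm(A):
    # matrix exponential of a Hermitian matrix, via eigendecomposition
    w, U = np.linalg.eigh(A)
    return (U * np.exp(w)) @ U.conj().T

A, B = rand_hermitian(5), rand_hermitian(5)
lhs = np.trace(expm_herm(A + B)).real
eA = expm_herm(A)
tr_eA = np.trace(eA).real
rhs = tr_eA * np.exp(np.trace(B @ eA).real / tr_eA)
print(lhs >= rhs)  # True
```

Taking logarithms gives the equivalent form ln Tr e^{A+B} ≥ ln Tr e^A + Tr(B e^A)/Tr e^A, which follows from the convexity of A ↦ ln Tr e^A.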
Telescopic Relative Entropy
"... Abstract. We introduce the telescopic relative entropy (TRE), which is a new regularisation of the relative entropy related to smoothing, to overcome the problem that the relative entropy between pure states is either zero or infinity and therefore useless as a distance measure in this case. We stud ..."
Cited by 2 (1 self)
Gaussian relative entropy of entanglement
"... For two Gaussian states with given correlation matrices, in order that the relative entropy between them be practically calculable, this paper describes how to transform the correlation matrix into the matrix in the exponential density operator. A Gaussian relative entropy of entanglement is proposed ..."
Cited by 1 (1 self)
RELATIVE ENTROPY IN DIFFUSIVE RELAXATION
"... Abstract. We establish convergence in the diffusive limit from entropy weak solutions of the equations of compressible gas dynamics with friction to the porous media equation away from vacuum. The result is based on a Lyapunov-type functional provided by a calculation of the relative entropy. The ..."
Cited by 4 (2 self)
Forecasting using relative entropy
Journal of Money, Credit, and Banking, 2005
"... Abstract: The paper describes a relative entropy procedure for imposing moment restrictions on simulated forecast distributions from a variety of models. Starting from an empirical forecast distribution for some variables of interest, the technique generates a new empirical distribution that satisfi ..."
Cited by 17 (0 self)
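The generic mechanism behind this kind of procedure, exponential tilting of a simulated forecast sample so that the reweighted draws satisfy a moment restriction with minimum relative entropy to the original weights, can be sketched as follows. The variable names, the Gaussian forecast sample, and the mean restriction are our own illustration, not the paper's application.

```python
# Relative-entropy (exponential) tilting of a simulated forecast sample:
# among all reweightings w of the draws x_i, the one with weighted mean
# equal to `target` and minimum KL to uniform weights has the form
# w_i proportional to exp(lam * x_i); we find lam by bisection.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=1.0, size=100_000)  # hypothetical forecast draws
target = 2.5                                      # imposed moment: E[x] = 2.5

def tilt_weights(x, target, lo=-50.0, hi=50.0, iters=200):
    def wmean(lam):
        t = lam * x
        w = np.exp(t - t.max())  # shift exponents for numerical stability
        w /= w.sum()
        return w, float(np.sum(w * x))
    # the tilted mean is increasing in lam, so bisect
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        w, m = wmean(lam)
        if m < target:
            lo = lam
        else:
            hi = lam
    return w

w = tilt_weights(x, target)
print(np.sum(w * x))  # ~= 2.5, with minimum-KL weights
```

In practice one would also monitor the effective sample size of the tilted weights; restrictions far from the original forecast concentrate the mass on few draws.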
Relative Entropy and Statistics
2008
"... Shannon’s Information Theory (IT) (1948) definitively established the purely mathematical nature of entropy and relative entropy, in contrast to the previous identification by Boltzmann (1872) of his “H-functional” as the physical entropy of earlier thermodynamicians (Carnot, Clausius, Kelvin). The f ..."
Cited by 1 (0 self)