Results 1-10 of 34
Families of Alpha- Beta- and Gamma-Divergences: Flexible and Robust Measures of Similarities
, 2010
A variational framework for nonlocal image inpainting
 PROC. OF EMMCVPR
, 2009
Abstract

Cited by 28 (3 self)
Nonlocal methods for image denoising and inpainting have gained considerable attention in recent years. This is in part due to their superior performance on textured images and regions, a known weakness of purely local methods. Local methods, on the other hand, have been shown to be very appropriate for recovering geometric structure such as image edges. The synthesis of both types of methods is a trend in current research. Variational analysis in particular is an appropriate tool for a unified treatment of local and nonlocal methods. In this work we propose a general variational framework for the problem of nonlocal image inpainting, from which several previous inpainting schemes can be derived, in addition to leading to novel ones. We explicitly study some of these, relating them to previous work and showing results on synthetic and real images.
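To make the nonlocal principle described above concrete, the following is a minimal sketch of a patch-based nonlocal-means denoiser; it is an illustration of the general idea, not the paper's inpainting scheme, and the patch size, search window, and filtering parameter `h` are arbitrary illustrative choices.

```python
import numpy as np

def nonlocal_means_denoise(image, patch=3, search=7, h=0.1):
    """Toy nonlocal-means filter: each pixel becomes a weighted average of
    nearby pixels, where weights depend on the similarity of the square
    patches surrounding the pixels (not on spatial distance alone)."""
    pad = patch // 2
    padded = np.pad(image, pad, mode="reflect")
    rows, cols = image.shape
    out = np.zeros_like(image, dtype=float)
    half = search // 2
    for i in range(rows):
        for j in range(cols):
            p = padded[i:i + patch, j:j + patch]      # reference patch
            weights, values = [], []
            for di in range(-half, half + 1):
                for dj in range(-half, half + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < rows and 0 <= jj < cols:
                        q = padded[ii:ii + patch, jj:jj + patch]
                        d2 = np.mean((p - q) ** 2)     # patch distance
                        weights.append(np.exp(-d2 / (h * h)))
                        values.append(image[ii, jj])
            out[i, j] = np.dot(weights, values) / np.sum(weights)
    return out
```

Because the weights compare whole patches, repeated texture anywhere in the search window contributes to the estimate, which is exactly the behavior the abstract contrasts with purely local filters.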
A VARIATIONAL FRAMEWORK FOR EXEMPLAR-BASED IMAGE INPAINTING
, 2010
Abstract

Cited by 23 (2 self)
Nonlocal methods for image denoising and inpainting have gained considerable attention in recent years. This is in part due to their superior performance on textured images, a known weakness of purely local methods. Local methods, on the other hand, have been shown to be very appropriate for recovering geometric structures such as image edges. The synthesis of both types of methods is a trend in current research. Variational analysis in particular is an appropriate tool for a unified treatment of local and nonlocal methods. In this work we propose a general variational framework for nonlocal image inpainting, from which important and representative previous inpainting schemes can be derived, in addition to leading to novel ones. We explicitly study some of these, relating them to previous work and showing results on synthetic and real images.
Arimoto channel coding converse and Rényi divergence
 In Proceedings of the 48th Annual Allerton Conference on Communication, Control, and Computation
, 2010
Abstract

Cited by 17 (2 self)
Arimoto [1] proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code on a given discrete memoryless channel. In this paper we present a simple derivation of the Arimoto converse based on the data-processing inequality for Rényi divergence. The method has two benefits. First, it generalizes to codes with feedback and gives the simplest proof of the strong converse for the DMC with feedback. Second, it demonstrates that the sphere-packing bound is strictly tighter than the Arimoto converse for all channels, blocklengths and rates, since in fact we derive the latter from the former. Finally, we prove similar results for other (non-Rényi) divergence measures.
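The data-processing inequality on which this derivation rests says that pushing two distributions through the same channel can only decrease their Rényi divergence. A small numerical sketch, where the distributions `p`, `q` and the row-stochastic channel `W` are arbitrary illustrative choices:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1)
    between discrete distributions p and q:
    D_alpha(p||q) = log(sum_i p_i^alpha q_i^(1-alpha)) / (alpha - 1)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0)

# Illustrative distributions and channel (3 inputs -> 2 outputs).
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.3, 0.6])
W = np.array([[0.9, 0.1],
              [0.5, 0.5],
              [0.2, 0.8]])

# Data-processing inequality: D_alpha(pW || qW) <= D_alpha(p || q).
for alpha in (0.5, 2.0, 10.0):
    assert renyi_divergence(p @ W, q @ W, alpha) <= renyi_divergence(p, q, alpha)
```

The inequality holds for every order checked; the paper's point is that applying it to the channel output distribution yields Arimoto's converse directly.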
A note on a characterization of Rényi measures and its relation to composite hypothesis testing, arXiv preprint arXiv:1012.4401
, 2010
Evaluating Trace Aggregation Through Entropy Measures for Optimal Performance Visualization of Large Distributed Systems
, 2013
THERMODYNAMIC SEMIRINGS
, 2011
Abstract

Cited by 2 (1 self)
Thermodynamic semirings are deformed additive structures on characteristic one semirings, defined using a binary information measure. The algebraic properties of the semiring encode thermodynamical and information theoretic properties of the entropy function. Besides the case of the Shannon entropy, which arises in the context of geometry over the field with one element and the Witt construction in characteristic one, there are other interesting thermodynamic semirings associated to the Rényi and Tsallis entropies, and to the Kullback-Leibler divergence, with connections to information geometry, multifractal analysis, and statistical mechanics. A more general theory of thermodynamic semirings is then formulated in categorical terms, by encoding all partial associativity and commutativity constraints into an entropy operad and a corresponding information algebra.
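The entropy deformation of addition can be made concrete in the simplest case, the min-plus (tropical) semiring with the binary Shannon entropy. The sketch below is a numerical illustration under that assumption, not the paper's categorical construction: it evaluates the deformed sum x (+)_beta y = min_p [ p*x + (1-p)*y - S(p)/beta ] by grid search and compares it with its log-sum-exp closed form, which recovers ordinary min as beta grows.

```python
import math

def entropy(p):
    """Binary Shannon entropy S(p) = -p log p - (1-p) log(1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def deformed_add(x, y, beta, steps=20000):
    """Entropy-deformed addition on the min-plus semiring,
    minimized over p by a simple grid search."""
    best = min(x, y)
    for k in range(steps + 1):
        p = k / steps
        best = min(best, p * x + (1 - p) * y - entropy(p) / beta)
    return best

def log_sum_exp_form(x, y, beta):
    """Closed form of the same minimum:
    -(1/beta) * log(exp(-beta*x) + exp(-beta*y))."""
    return -math.log(math.exp(-beta * x) + math.exp(-beta * y)) / beta

# The grid search agrees with the closed form...
assert abs(deformed_add(1.0, 2.0, 1.0) - log_sum_exp_form(1.0, 2.0, 1.0)) < 1e-3
# ...and as beta -> infinity the deformation vanishes, recovering min.
assert abs(log_sum_exp_form(1.0, 2.0, 100.0) - min(1.0, 2.0)) < 0.01
```

The entropy term makes the deformed addition commutative but only associative up to the constraints the paper encodes operadically; the closed form shows it is a smoothed minimum.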
DIVERGENCE FUNCTION, INFORMATION MONOTONICITY AND INFORMATION GEOMETRY
Abstract

Cited by 1 (0 self)
A divergence function measures how different two points are in a base space. Well-known examples are the Kullback-Leibler divergence and the f-divergence, which are defined on a manifold of probability distributions. The Bregman divergence is used in a more general situation. The present paper characterizes the geometrical structure which a divergence function gives, and proves that the f-divergences are unique in the sense of information invariance, giving the alpha-geometrical structure. Bregman divergences are characterized by a dually flat geometrical structure. The paper also studies geometrical properties of hierarchical models which include singular structure.
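The three divergence families the abstract compares can be sketched directly. These illustrative Python definitions (a minimal sketch, not the paper's formalism) show that the Kullback-Leibler divergence is the f-divergence with f(t) = t log t, and that the Bregman divergence generated by the squared norm reduces to squared Euclidean distance:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence between discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def f_divergence(p, q, f):
    """f-divergence D_f(p||q) = sum_i q_i * f(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(q * f(p / q))

def bregman_divergence(phi, grad_phi, x, y):
    """Bregman divergence B_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# KL is the f-divergence with f(t) = t log t.
p, q = [0.5, 0.5], [0.25, 0.75]
assert abs(kl_divergence(p, q) - f_divergence(p, q, lambda t: t * np.log(t))) < 1e-12

# Bregman divergence of phi(x) = ||x||^2 is the squared Euclidean distance.
phi = lambda v: float(np.dot(v, v))
grad_phi = lambda v: 2.0 * np.asarray(v, float)
x, y = np.array([1.0, 2.0]), np.array([3.0, 1.0])
assert abs(bregman_divergence(phi, grad_phi, x, y) - np.sum((x - y) ** 2)) < 1e-12
```

The two assertions mirror the abstract's contrast: f-divergences live on probability distributions and are invariant in the paper's sense, while Bregman divergences come from a convex generator on a general (dually flat) space.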