Results 1–10 of 376
On the geometry of metric measure spaces II
Acta Math., 2004
Cited by 247 (9 self)

Abstract
We introduce and analyze lower (‘Ricci’) curvature bounds Curv(M, d, m) ≥ K for metric measure spaces (M, d, m). Our definition is based on convexity properties of the relative entropy Ent(·|m), regarded as a function on the L2-Wasserstein space of probability measures on the metric space (M, d). Among other results, we show that Curv(M, d, m) ≥ K implies estimates for the volume growth of concentric balls. For Riemannian manifolds, Curv(M, d, m) ≥ K if and only if Ric_M(ξ, ξ) ≥ K · |ξ|² for all ξ ∈ TM. The crucial point is that our lower curvature bounds are stable under an appropriate notion of D-convergence of metric measure spaces. We define a complete and separable metric D on the family of all isomorphism classes of normalized metric measure spaces. The metric D has a natural interpretation, based on the concept of optimal mass transportation. We also prove that the family of normalized metric measure spaces with doubling constant ≤ C is closed under D-convergence. Moreover, the family of normalized metric measure spaces with doubling constant ≤ C and radius ≤ R is compact under D-convergence.
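The convexity property the abstract refers to can be made explicit. In the standard formulation of displacement K-convexity (stated here from the general theory, not quoted from the paper), Curv(M, d, m) ≥ K means that along every geodesic (ρ_t) in the L2-Wasserstein space:

```latex
\operatorname{Ent}(\rho_t \mid m) \;\le\; (1-t)\,\operatorname{Ent}(\rho_0 \mid m)
  \;+\; t\,\operatorname{Ent}(\rho_1 \mid m)
  \;-\; \frac{K}{2}\, t(1-t)\, W_2^2(\rho_0, \rho_1),
  \qquad t \in [0,1].
```

For K = 0 this is ordinary convexity of the entropy along Wasserstein geodesics; positive K quantifies strict convexity by the squared Wasserstein distance between the endpoints.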
Exponential integrability and transportation cost related to logarithmic Sobolev inequalities
J. Funct. Anal., 1999
Cited by 164 (9 self)

Abstract
We study some problems on exponential integrability, concentration of measure, and transportation cost related to logarithmic Sobolev inequalities. On the real line, we then give a characterization of those probability measures which satisfy these inequalities.
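For reference, a logarithmic Sobolev inequality in the sense discussed here is usually stated as follows (a standard form, not quoted from the paper): a probability measure μ satisfies a log-Sobolev inequality with constant C > 0 if, for all smooth f,

```latex
\operatorname{Ent}_\mu(f^2)
  \;=\; \int f^2 \log f^2 \, d\mu
  \;-\; \Bigl(\int f^2 \, d\mu\Bigr) \log \Bigl(\int f^2 \, d\mu\Bigr)
  \;\le\; 2C \int |\nabla f|^2 \, d\mu .
```

By Herbst's argument, such an inequality yields Gaussian concentration and exponential integrability of Lipschitz functions, which is the link between the three topics named in the abstract.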
The objective method: Probabilistic combinatorial optimization and local weak convergence
2003
On Talagrand's Deviation Inequalities For Product Measures
1996
Cited by 112 (0 self)

Abstract
We present a new and simple approach to some of the deviation inequalities for product measures deeply investigated by M. Talagrand.
A new look at independence
Cited by 111 (0 self)

Abstract
The concentration of measure phenomenon in product spaces is a far-reaching abstract generalization of the classical exponential inequalities for sums of independent random variables. We attempt to explain in the simplest possible terms the basic concepts underlying this phenomenon, the basic method to prove concentration inequalities, and the meaning of several of the most useful inequalities.
Concentration of the Spectral Measure for Large Matrices
Electronic Communications in Probability, 2000
Cited by 101 (13 self)

Abstract
We derive concentration inequalities for functions of the empirical measure of eigenvalues for large, random, self-adjoint matrices, with not necessarily Gaussian entries. The results presented apply in particular to non-Gaussian Wigner and Wishart matrices. We also provide concentration bounds for noncommutative functionals of random matrices.
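The flavor of this concentration can be seen numerically. The sketch below is a minimal illustration, not the paper's method; the matrix size, sample count, and test function |x| are arbitrary choices. It samples Wigner matrices with ±1 entries (non-Gaussian, as in the abstract) and checks that a Lipschitz linear statistic of the eigenvalues barely fluctuates around its semicircle-law value, about 8/(3π) ≈ 0.85:

```python
import numpy as np

def wigner_pm1(n, rng):
    # Symmetric matrix with independent +/-1 entries (a non-Gaussian
    # Wigner matrix), scaled by 1/sqrt(n) so eigenvalues fill [-2, 2].
    A = rng.choice([-1.0, 1.0], size=(n, n))
    A = np.triu(A) + np.triu(A, 1).T
    return A / np.sqrt(n)

def linear_statistic(A, f):
    # Integral of f against the empirical spectral measure of A.
    return float(np.mean(f(np.linalg.eigvalsh(A))))

def concentration_demo(n=100, samples=30, seed=0):
    # Mean and spread of the statistic over independent draws: the
    # standard deviation is tiny compared to the mean, illustrating
    # concentration of the spectral measure.
    rng = np.random.default_rng(seed)
    vals = [linear_statistic(wigner_pm1(n, rng), np.abs)
            for _ in range(samples)]
    return float(np.mean(vals)), float(np.std(vals))
```

The fluctuation scale shrinks as the dimension n grows, which is exactly the regime the paper's inequalities quantify.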
Noise sensitivity of Boolean functions and applications to percolation
2008
Cited by 101 (18 self)

Abstract
It is shown that a large class of events in a product probability space are highly sensitive to noise, in the sense that with high probability, the configuration with an arbitrarily small fraction of random errors gives almost no prediction of whether the event occurs. On the other hand, weighted majority functions are shown to be noise-stable. Several necessary and sufficient conditions for noise sensitivity and stability are given. Consider, for example, bond percolation on an n + 1 by n grid. A configuration is a function that assigns to every edge the value 0 or 1. Let ω be a random configuration, selected according to the uniform measure. A crossing is a path that joins the left and right sides of the rectangle, and consists entirely of edges e with ω(e) = 1. By duality, the probability of having a crossing is 1/2. Fix an ǫ ∈ (0, 1). For each edge e, let ω′(e) = ω(e) with probability 1 − ǫ, and ω′(e) = 1 − ω(e)
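The noise operator described here (flip each coordinate independently with probability ǫ) is easy to simulate. The sketch below is a minimal illustration under assumed parameters, not the authors' percolation experiment: it estimates the probability that the majority of n random bits is unchanged by the noise, illustrating the noise-stability of majority claimed in the abstract.

```python
import random

def noisy_copy(omega, eps, rng):
    # Flip each coordinate independently with probability eps.
    return [1 - x if rng.random() < eps else x for x in omega]

def majority(omega):
    # 1 if more than half the bits are 1 (use odd n to avoid ties).
    return 1 if 2 * sum(omega) > len(omega) else 0

def agreement_rate(f, n, eps, trials=20000, seed=0):
    # Estimate P[f(omega) == f(omega')] where omega' is the
    # eps-noisy copy of a uniformly random omega in {0,1}^n.
    rng = random.Random(seed)
    agree = 0
    for _ in range(trials):
        omega = [rng.randint(0, 1) for _ in range(n)]
        agree += f(omega) == f(noisy_copy(omega, eps, rng))
    return agree / trials
```

For small ǫ the agreement rate for majority stays well above 1/2, while noise-sensitive functions (such as the crossing event in the percolation example) would have agreement tending to 1/2 for any fixed ǫ as the system grows.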
Theory of classification: A survey of some recent advances
2005
Cited by 96 (3 self)

Abstract
The last few years have witnessed important new developments in the theory and practice of pattern classification. We intend to survey some of the main new ideas that have led to these recent results.