Results 11–20 of 377
Concentration inequalities
ADVANCED LECTURES IN MACHINE LEARNING, 2004
Abstract
Cited by 89 (1 self)
Concentration inequalities deal with deviations of functions of independent random variables from their expectation. In the last decade new tools have been introduced making it possible to establish simple and powerful inequalities. These inequalities are at the heart of the mathematical analysis of various problems in machine learning and made it possible to derive new efficient algorithms. This text attempts to summarize some of the basic tools.
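As a concrete illustration of the kind of statement these tools produce, the following sketch (illustrative only; the function names and the choice of uniform variables are assumptions, not taken from the survey) checks Hoeffding's inequality for the mean of independent [0, 1]-valued variables:

```python
import math
import random

def hoeffding_bound(n, t):
    # Hoeffding: P(|mean - E[mean]| >= t) <= 2 * exp(-2 n t^2)
    # for n independent variables with values in [0, 1].
    return 2.0 * math.exp(-2.0 * n * t * t)

def empirical_tail(n, t, trials=2000, seed=0):
    # Empirical probability that the mean of n Uniform[0, 1]
    # variables deviates from its expectation 1/2 by at least t.
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() for _ in range(n)) / n
        if abs(mean - 0.5) >= t:
            hits += 1
    return hits / trials

n, t = 200, 0.1
# The empirical tail should never exceed the theoretical bound.
assert empirical_tail(n, t) <= hoeffding_bound(n, t)
```

The mean is one function of independent variables; the entropy-method and Talagrand-type inequalities surveyed here extend this kind of tail bound to far more general functions.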
Non-asymptotic theory of random matrices: extreme singular values
PROCEEDINGS OF THE INTERNATIONAL CONGRESS OF MATHEMATICIANS, 2010
On the concentration of eigenvalues of random symmetric matrices
Israel J. Math., 2000
Abstract
Cited by 84 (11 self)
It is shown that for every 1 ≤ s ≤ n, the probability that the s-th largest eigenvalue of a random symmetric n-by-n matrix with independent random entries of absolute value at most 1 deviates from its median by more than t is at most 4e^(−t²/(32s²)). The main ingredient in the proof is Talagrand's inequality for concentration of measure in product spaces.
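The stated bound can be probed numerically. A minimal sketch, assuming NumPy, with the matrix model, helper names, and sample sizes as illustrative choices rather than the paper's construction (note the bound is only nontrivial once t²/(32s²) > ln 4):

```python
import numpy as np

def sth_largest_eigenvalue(n, s, rng):
    # Random symmetric n-by-n matrix with independent entries
    # uniform in [-1, 1] (absolute value at most 1).
    a = rng.uniform(-1.0, 1.0, size=(n, n))
    sym = np.triu(a) + np.triu(a, 1).T
    eigs = np.linalg.eigvalsh(sym)   # eigenvalues in ascending order
    return eigs[-s]                  # s-th largest

def deviation_tail(n, s, t, trials=200, seed=0):
    # Empirical P(|lambda_s - median(lambda_s)| > t), to compare
    # against the paper's bound 4 * exp(-t^2 / (32 s^2)).
    rng = np.random.default_rng(seed)
    samples = np.array([sth_largest_eigenvalue(n, s, rng)
                        for _ in range(trials)])
    med = np.median(samples)
    return float(np.mean(np.abs(samples - med) > t))

n, s, t = 50, 1, 8.0
bound = 4.0 * np.exp(-t * t / (32.0 * s * s))
assert deviation_tail(n, s, t) <= bound
```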
Concentration inequalities using the entropy method
Abstract
Cited by 67 (3 self)
We investigate a new methodology, worked out by Ledoux and Massart, to prove concentration-of-measure inequalities. The method is based on certain modified logarithmic Sobolev inequalities. We provide some very simple and general ready-to-use inequalities. One of these inequalities may be considered as an exponential version of the Efron–Stein inequality. The main purpose of this paper is to point out the simplicity and the generality of the approach. We show how the new method can recover many of Talagrand's revolutionary inequalities and provide new applications in a variety of problems including Rademacher averages, Rademacher chaos, the number of certain small subgraphs in a random graph, and the minimum of the empirical risk in some statistical estimation problems.
Interpolated inequalities between exponential and Gaussian, Orlicz hypercontractivity and isoperimetry, 2004
First Passage Percolation Has Sublinear Distance Variance
Ann. Probab., 2003
Abstract
Cited by 55 (7 self)
Let 0 < a < b < ∞, and for each edge e of Z^d let ω_e = a or ω_e = b, each with probability 1/2, independently. This induces a random metric dist_ω on the vertices of Z^d, called first passage percolation. We prove that for d > 1 the distance dist_ω(0, v) from the origin to a vertex v, |v| > 2, has variance bounded by C|v|/log|v|, where C = C(a, b, d) is a constant which may depend only on a, b and d. Some related variants are also discussed.
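The model itself is simple to simulate: assign each grid edge the weight a or b with probability 1/2 and compute shortest paths with Dijkstra's algorithm. A minimal sketch on a finite n × n box in Z² (the function name and grid size are illustrative assumptions, not the paper's setup):

```python
import heapq
import random

def fpp_distance(n, a, b, seed=0):
    # First-passage percolation distance dist_omega from (0, 0) to
    # (n-1, n-1) on an n x n grid: each edge independently gets
    # weight a or b with probability 1/2 each.
    rng = random.Random(seed)
    weight = {}

    def w(u, v):
        # Sample each undirected edge weight once and cache it.
        key = (min(u, v), max(u, v))
        if key not in weight:
            weight[key] = a if rng.random() < 0.5 else b
        return weight[key]

    dist = {(0, 0): 0.0}
    heap = [(0.0, (0, 0))]
    while heap:  # standard Dijkstra over the random metric
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        x, y = u
        for v in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= v[0] < n and 0 <= v[1] < n:
                nd = d + w(u, v)
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
    return dist[(n - 1, n - 1)]

d = fpp_distance(20, 1.0, 2.0)
# Any path to (19, 19) uses at least 38 edges, and the straight
# staircase path uses exactly 38, so a*38 <= d <= b*38.
assert 1.0 * 38 <= d <= 2.0 * 38
```

Repeating this over many seeds gives empirical access to the variance of dist_ω(0, v) whose sublinear growth the paper proves.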
Competition interfaces and second class particles, 2005
Abstract
Cited by 42 (7 self)
The one-dimensional nearest-neighbor totally asymmetric simple exclusion process can be constructed in the same space as a last-passage percolation model in Z^2. We show that the trajectory of a second class particle in the exclusion process can be linearly mapped into the competition interface between two growing clusters in the last-passage percolation model. Using technology built up for geodesics in percolation, we show that the competition interface converges almost surely to an asymptotic random direction. As a consequence we get a new proof for the strong law of large numbers for the second class particle in the rarefaction fan and describe the distribution of the asymptotic angle of the competition interface.