Results 1–10 of 29
Local Rademacher complexities
Annals of Statistics, 2002
"... We propose new bounds on the error of learning algorithms in terms of a datadependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a ..."
Cited by 161 (21 self)
We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small empirical error. We present some applications to classification and prediction with convex function classes, and with kernel classes in particular.
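As an illustration of the quantity at the core of this abstract, the empirical Rademacher average of a finite function class can be approximated by Monte Carlo over random signs. A minimal sketch; the threshold class, sample sizes, and all names below are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite class: threshold functions f_t(x) = 1{x > t} evaluated
# on n sample points (the class and all sizes are illustrative only).
n = 200
x = rng.uniform(0.0, 1.0, size=n)
thresholds = np.linspace(0.0, 1.0, 21)
F = (x[None, :] > thresholds[:, None]).astype(float)  # shape (21, n): rows hold f(x_1), ..., f(x_n)

# Monte Carlo estimate of the empirical Rademacher average
#   R_hat = E_sigma [ sup_f (1/n) * sum_i sigma_i f(x_i) ],
# where the sigma_i are independent random signs.
n_mc = 500
sigma = rng.choice([-1.0, 1.0], size=(n_mc, n))
sup_per_draw = (sigma @ F.T / n).max(axis=1)  # sup over the class, per sign draw
R_hat = sup_per_draw.mean()
print(f"empirical Rademacher average ~ {R_hat:.3f}")
```

The localization emphasized in the paper would additionally restrict the supremum to the subset of functions with small empirical error before taking the maximum.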
Compressive Sensing and Structured Random Matrices
Radon Series Comp. Appl. Math. XX, 1–95, © de Gruyter, 2011
"... These notes give a mathematical introduction to compressive sensing focusing on recovery using ℓ1minimization and structured random matrices. An emphasis is put on techniques for proving probabilistic estimates for condition numbers of structured random matrices. Estimates of this type are key to ..."
Cited by 156 (18 self)
These notes give a mathematical introduction to compressive sensing, focusing on recovery using ℓ1-minimization and structured random matrices. An emphasis is put on techniques for proving probabilistic estimates for condition numbers of structured random matrices. Estimates of this type are key to providing conditions that ensure exact or approximate recovery of sparse vectors using ℓ1-minimization.
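As a sketch of the recovery method discussed here, basis pursuit (minimize ‖z‖₁ subject to Az = y) can be recast as a linear program via the standard splitting z = u − v with u, v ≥ 0. A minimal example using SciPy's `linprog`; the Gaussian measurement matrix and all dimensions below are illustrative choices, not from the notes:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)

# Illustrative setup: recover a 3-sparse vector x0 in R^60 from m = 30
# Gaussian measurements (all dimensions are arbitrary demonstration choices).
m, n, k = 30, 60, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x0

# Basis pursuit as a linear program: write z = u - v with u, v >= 0 and
# minimize sum(u) + sum(v) subject to A(u - v) = y.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
z = res.x[:n] - res.x[n:]
print("max recovery error:", np.max(np.abs(z - x0)))
```

With a well-conditioned random matrix and sufficiently few nonzeros, the LP solution should coincide with the sparse vector up to solver tolerance.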
Lectures on the central limit theorem for empirical processes
Probability and Banach Spaces, 1986
"... Abstract. Concentration inequalities are used to derive some new inequalities for ratiotype suprema of empirical processes. These general inequalities are used to prove several new limit theorems for ratiotype suprema and to recover anumber of the results from [1] and [2]. As a statistical applica ..."
Cited by 135 (9 self)
Concentration inequalities are used to derive some new inequalities for ratio-type suprema of empirical processes. These general inequalities are used to prove several new limit theorems for ratio-type suprema and to recover a number of the results from [1] and [2]. As a statistical application, an oracle inequality for nonparametric regression is obtained via ratio bounds.
Theory of classification: A survey of some recent advances
2005
"... The last few years have witnessed important new developments in the theory and practice of pattern classification. We intend to survey some of the main new ideas that have led to these recent results. ..."
Cited by 91 (3 self)
The last few years have witnessed important new developments in the theory and practice of pattern classification. We intend to survey some of the main new ideas that have led to these recent results.
Concentration inequalities
Advanced Lectures in Machine Learning, 2004
"... Concentration inequalities deal with deviations of functions of independent random variables from their expectation. In the last decade new tools have been introduced making it possible to establish simple and powerful inequalities. These inequalities are at the heart of the mathematical analysis o ..."
Cited by 85 (1 self)
Concentration inequalities deal with deviations of functions of independent random variables from their expectation. In the last decade, new tools have been introduced that make it possible to establish simple and powerful inequalities. These inequalities are at the heart of the mathematical analysis of various problems in machine learning and have made it possible to derive new, efficient algorithms. This text attempts to summarize some of the basic tools.
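One of the basic tools of this kind, the bounded-differences (McDiarmid) inequality, can be checked by simulation. A minimal sketch; the choice f = sample mean and all parameters below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Bounded-differences check for f(X_1, ..., X_n) = mean(X) with
# X_i ~ Uniform(0, 1): changing one coordinate moves f by at most c = 1/n,
# so McDiarmid gives  P(|f - E f| >= t) <= 2 * exp(-2 * n * t^2).
n, trials, t = 100, 20000, 0.1
X = rng.uniform(0.0, 1.0, size=(trials, n))
dev = np.abs(X.mean(axis=1) - 0.5)          # deviations of f from E f = 1/2
empirical_tail = (dev >= t).mean()           # observed tail probability
mcdiarmid_bound = 2.0 * np.exp(-2.0 * n * t ** 2)
print(f"empirical tail {empirical_tail:.5f} <= bound {mcdiarmid_bound:.3f}")
```

The simulated tail probability should sit well below the bound, which is loose but holds uniformly over all functions with the stated bounded-differences property.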
Concentration around the mean for maxima of empirical processes
2005
"... In this paper we give optimal constants in Talagrand’s concentration inequalities for maxima of empirical processes associated to independent and eventually nonidentically distributed random variables. Our approach is based on the entropy method introduced by Ledoux. ..."
Cited by 58 (1 self)
In this paper we give optimal constants in Talagrand’s concentration inequalities for maxima of empirical processes associated with independent and possibly non-identically distributed random variables. Our approach is based on the entropy method introduced by Ledoux.
Concentration inequalities and asymptotic results for ratio type empirical processes
Ann. Probab., 2006
"... Let F be a class of measurable functions on a measurable space (S, S) with values in [0, 1] and let Pn = n −1 n ∑ δXi i=1 be the empirical measure based on an i.i.d. sample (X1,...,Xn) from a probability distribution P on (S, S). We study the behavior of suprema of the following type: sup rn<σP f ..."
Cited by 40 (5 self)
Let F be a class of measurable functions on a measurable space (S, S) with values in [0, 1] and let P_n = n^{-1} ∑_{i=1}^{n} δ_{X_i} be the empirical measure based on an i.i.d. sample (X_1, ..., X_n) from a probability distribution P on (S, S). We study the behavior of suprema of the following type: sup_{r_n < σ_P f ≤ δ_n} |P_n f − P f| / φ(σ_P f), where σ_P f ≥ Var_P^{1/2} f and φ is a continuous, strictly increasing function with φ(0) = 0. Using Talagrand’s concentration inequality for empirical processes, we establish concentration inequalities for such suprema and use them to derive several results about their asymptotic behavior, expressing the conditions in terms of expectations of localized suprema of empirical processes. We also prove new bounds for expected values of sup-norms of empirical processes in terms of the largest σ_P f and the L2(P) norm of the envelope of the function class, which are especially suited for estimating localized suprema. With this technique, we extend to function classes most of the known results on ratio-type suprema of empirical processes, including some of Alexander’s results for VC classes of sets. We also consider applications of these results to several important problems in nonparametric statistics and in learning theory (including general excess risk bounds in empirical risk minimization and their versions for L2-regression and classification, and ratio-type bounds for margin distributions in classification).
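For intuition, a localized ratio-type supremum of this kind can be simulated for a simple class. A sketch with indicator functions of half-lines under the uniform distribution and φ(s) = s; the grid and the localization levels r_n, δ_n below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative class: f_t(x) = 1{x <= t} with X_i ~ Uniform(0, 1), so that
# P f_t = t and sigma_P f_t = sqrt(t * (1 - t)); we take phi(s) = s.
n = 2000
X = rng.uniform(0.0, 1.0, size=n)
t = np.linspace(0.01, 0.99, 99)
Pn_f = (X[None, :] <= t[:, None]).mean(axis=1)  # empirical means P_n f_t
P_f = t
sigma = np.sqrt(t * (1.0 - t))

# Localize to functions with r_n < sigma_P f <= delta_n and form the
# ratio-type supremum  sup |P_n f - P f| / phi(sigma_P f).
r_n, delta_n = 0.15, 0.5
mask = (sigma > r_n) & (sigma <= delta_n)
ratio_sup = np.max(np.abs(Pn_f[mask] - P_f[mask]) / sigma[mask])
print(f"localized ratio-type supremum ~ {ratio_sup:.3f}")
```

For this class the normalized deviations behave like a standardized Brownian bridge scaled by n^{-1/2}, so the supremum shrinks as n grows while remaining sensitive to the lower localization level r_n.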
Concentration inequalities for functions of independent variables
 Random Structures and Algorithms
"... Following the entropy method this paper presents general concentration inequalities, which can be applied to combinatorial optimization and empirical processes. The inequalities give improved concentration results for optimal travelling salesmen tours, Steiner trees and the eigenvalues of random sy ..."
Cited by 19 (8 self)
Following the entropy method, this paper presents general concentration inequalities which can be applied to combinatorial optimization and empirical processes. The inequalities give improved concentration results for optimal travelling salesman tours, Steiner trees and the eigenvalues of random symmetric matrices.
Rates of contraction for posterior distributions in Lr-metrics, 1 ≤ r ≤ ∞
The Annals of Statistics, 2011
"... ..."
Global Uniform Risk Bounds for Wavelet Deconvolution Estimators
The Annals of Statistics, 2011
"... We consider the statistical deconvolution problem where one observes n replications from the model Y = X + , where X is the unobserved random signal of interest and is an independent random error with distribution ϕ. Under weak assumptions on the decay of the Fourier transform of ϕ, we derive upper ..."
Cited by 12 (5 self)
We consider the statistical deconvolution problem where one observes n replications from the model Y = X + ε, where X is the unobserved random signal of interest and ε is an independent random error with distribution ϕ. Under weak assumptions on the decay of the Fourier transform of ϕ, we derive upper bounds for the finite-sample sup-norm risk of wavelet deconvolution density estimators f_n for the density f of X, where f: R → R is assumed to be bounded. We then derive lower bounds for the minimax sup-norm risk over Besov balls in this estimation problem and show that wavelet deconvolution density estimators attain these bounds. We further show that linear estimators adapt to the unknown smoothness of f if the Fourier transform of ϕ decays exponentially, and that a corresponding result holds true for the hard thresholding wavelet estimator if ϕ decays polynomially. We also analyze the case where f is a “supersmooth”/analytic density. We finally show how our results and recent techniques from Rademacher processes can be applied to construct global confidence bands for the density f.