Results 1–10 of 66
Local Rademacher complexities
Annals of Statistics, 2002
"... We propose new bounds on the error of learning algorithms in terms of a datadependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a ..."
Abstract

Cited by 174 (21 self)
 Add to MetaCart
(Show Context)
We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small empirical error. We present some applications to classification and prediction with convex function classes, and with kernel classes in particular.
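The empirical Rademacher average described above, and its localized variant restricted to low-loss functions, can be estimated by Monte Carlo over random sign vectors. A minimal NumPy sketch (the function names, the finite function class represented as a matrix, and the loss-radius knob are illustrative assumptions, not the paper's construction):

```python
import numpy as np

def empirical_rademacher(F, n_rounds=500, seed=0):
    """Monte Carlo estimate of the empirical Rademacher average
    E_sigma sup_{f in F} (1/n) sum_i sigma_i f(x_i),
    where F is an (m, n) array of m functions evaluated at n data points."""
    rng = np.random.default_rng(seed)
    m, n = F.shape
    total = 0.0
    for _ in range(n_rounds):
        sigma = rng.choice([-1.0, 1.0], size=n)  # Rademacher signs
        total += np.max(F @ sigma) / n           # sup over the class
    return total / n_rounds

def local_rademacher(F, emp_losses, radius, **kw):
    """Localized version: keep only functions whose empirical loss is
    below a radius, in the spirit of the localization sketched above."""
    subset = F[emp_losses <= radius]
    return empirical_rademacher(subset, **kw)
```

Because the localized average is a supremum over a smaller class, it can only shrink as the radius decreases, which is what lets localized bounds beat global ones.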
Compressive Sensing and Structured Random Matrices
 RADON SERIES COMP. APPL. MATH XX, 1–95 © DE GRUYTER 20YY
"... These notes give a mathematical introduction to compressive sensing focusing on recovery using ℓ1minimization and structured random matrices. An emphasis is put on techniques for proving probabilistic estimates for condition numbers of structured random matrices. Estimates of this type are key to ..."
Abstract

Cited by 162 (19 self)
 Add to MetaCart
These notes give a mathematical introduction to compressive sensing focusing on recovery using ℓ1-minimization and structured random matrices. An emphasis is put on techniques for proving probabilistic estimates for condition numbers of structured random matrices. Estimates of this type are key to providing conditions that ensure exact or approximate recovery of sparse vectors using ℓ1-minimization.
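ℓ1-minimization (basis pursuit) itself reduces to a linear program. A minimal SciPy sketch; the Gaussian measurement matrix, the sizes, and the seed are illustrative assumptions and not the structured matrices these notes analyze:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Solve min ||x||_1 subject to A x = b as an LP, splitting
    x = u - v with u, v >= 0 so that ||x||_1 = sum(u) + sum(v)
    at the optimum."""
    m, n = A.shape
    c = np.ones(2 * n)
    A_eq = np.hstack([A, -A])
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
    u, v = res.x[:n], res.x[n:]
    return u - v

# Illustrative experiment (assumed setup): recover a 3-sparse vector
# from m = 20 random Gaussian measurements in dimension n = 40.
rng = np.random.default_rng(1)
m, n, s = 20, 40, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
x_hat = basis_pursuit(A, A @ x_true)
```

The returned vector is always feasible and has ℓ1 norm no larger than that of the true sparse vector (which is itself feasible); under the conditions the notes establish, it typically coincides with it.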
Restricted isometries for partial random circulant matrices
 Appl. Comput. Harmon. Anal
"... In the theory of compressed sensing, restricted isometry analysis has become a standard tool for studying how efficiently a measurement matrix acquires information about sparse and compressible signals. Many recovery algorithms are known to succeed when the restricted isometry constants of the sampl ..."
Abstract

Cited by 48 (9 self)
 Add to MetaCart
(Show Context)
In the theory of compressed sensing, restricted isometry analysis has become a standard tool for studying how efficiently a measurement matrix acquires information about sparse and compressible signals. Many recovery algorithms are known to succeed when the restricted isometry constants of the sampling matrix are small. Many potential applications of compressed sensing involve a data-acquisition process that proceeds by convolution with a random pulse followed by (non-random) subsampling. At present, the theoretical analysis of this measurement technique is lacking. This paper demonstrates that the s-th order restricted isometry constant is small when the number m of samples satisfies m ≳ (s log n)^{3/2}, where n is the length of the pulse. This bound improves on previous estimates, which exhibit quadratic scaling.
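The measurement model in question, circular convolution with a random pulse followed by deterministic subsampling, can be sketched with the FFT. The sizes, the ±1 pulse, and the sample set below are illustrative assumptions:

```python
import numpy as np

def partial_circulant_measure(x, pulse, sample_idx):
    """Circularly convolve x with the pulse (via the FFT), then keep
    only the entries indexed by sample_idx: y = (R_Omega C_pulse) x."""
    conv = np.fft.ifft(np.fft.fft(pulse) * np.fft.fft(x)).real
    return conv[sample_idx]

def partial_circulant_matrix(pulse, sample_idx):
    """The same linear map written out as an explicit partial circulant
    matrix, C[i, j] = pulse[(i - j) mod n], rows restricted to Omega."""
    n = len(pulse)
    C = np.array([[pulse[(i - j) % n] for j in range(n)]
                  for i in range(n)])
    return C[sample_idx]
```

The FFT route costs O(n log n) per measurement vector versus O(mn) for the explicit matrix, which is one practical appeal of this acquisition scheme.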
Generalization Error Bounds for Bayesian Mixture Algorithms
Journal of Machine Learning Research, 2003
"... Bayesian approaches to learning and estimation have played a significant role in the Statistics literature over many years. While they are often provably optimal in a frequentist setting, and lead to excellent performance in practical applications, there have not been many precise characterizations ..."
Abstract

Cited by 38 (6 self)
 Add to MetaCart
(Show Context)
Bayesian approaches to learning and estimation have played a significant role in the Statistics literature over many years. While they are often provably optimal in a frequentist setting, and lead to excellent performance in practical applications, there have not been many precise characterizations of their performance for finite sample sizes under general conditions. In this paper we consider the class of Bayesian mixture algorithms, where an estimator is formed by constructing a data-dependent mixture over some hypothesis space. Similarly to what is observed in practice, our results demonstrate that mixture approaches are particularly robust, and allow for the construction of highly complex estimators, while avoiding undesirable overfitting effects. Our results, while being data-dependent in nature, are insensitive to the underlying model assumptions, and apply whether or not these hold. At a technical level, the approach applies to unbounded functions, constrained only by certain moment conditions. Finally, the bounds derived can be directly applied to non-Bayesian mixture approaches such as Boosting and Bagging.
Concentration inequalities for dependent random variables via the martingale method
Annals of Probability, 2008
"... The martingale method is used to establish concentration inequalities for a class of dependent random sequences on a countable state space, with the constants in the inequalities expressed in terms of certain mixing coefficients. Along the way, bounds are obtained on martingale differences associate ..."
Abstract

Cited by 35 (5 self)
 Add to MetaCart
(Show Context)
The martingale method is used to establish concentration inequalities for a class of dependent random sequences on a countable state space, with the constants in the inequalities expressed in terms of certain mixing coefficients. Along the way, bounds are obtained on martingale differences associated with the random sequences, which may be of independent interest. As applications of the main result, concentration inequalities are also derived for inhomogeneous Markov chains and hidden Markov chains, and an extremal property associated with their martingale difference bounds is established. This work complements and generalizes certain concentration inequalities obtained by Marton and Samson, while also providing different proofs of some known results.
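The martingale differences in question arise from the Doob decomposition f − E f = Σ_k (E[f | X_1..X_k] − E[f | X_1..X_{k−1}]); bounding each difference is what feeds Azuma–Hoeffding-type inequalities. A brute-force sketch for the independent case only, i.i.d. fair ±1 flips by exhaustive enumeration (the paper's dependent-sequence machinery and mixing coefficients are not reproduced here):

```python
import itertools

def doob_differences(f, x, n):
    """Martingale differences V_k = E[f|X_1..X_k] - E[f|X_1..X_{k-1}]
    along the realized path x (a tuple of +/-1), for i.i.d. fair +/-1
    flips, with conditional expectations computed by enumerating the
    remaining coordinates."""
    def cond_exp(prefix):
        rest = n - len(prefix)
        vals = [f(prefix + tail)
                for tail in itertools.product((-1, 1), repeat=rest)]
        return sum(vals) / len(vals)
    return [cond_exp(x[:k + 1]) - cond_exp(x[:k]) for k in range(n)]
```

The differences telescope to f(x) − E f exactly; if each satisfies |V_k| ≤ c_k, Azuma–Hoeffding gives P(|f − E f| ≥ t) ≤ 2 exp(−t² / (2 Σ_k c_k²)).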
Exponential concentration for First Passage Percolation through modified Poincaré Inequalities
2006
"... We provide a new exponential concentration inequality for First Passage Percolation valid for a wide class of edge times distributions. This improves and extends a result by Benjamini, Kalai and Schramm [5] which gave a variance bound for Bernoulli edge times. Our approach is based on some function ..."
Abstract

Cited by 33 (1 self)
 Add to MetaCart
We provide a new exponential concentration inequality for First Passage Percolation valid for a wide class of edge times distributions. This improves and extends a result by Benjamini, Kalai and Schramm [5] which gave a variance bound for Bernoulli edge times. Our approach is based on some functional inequalities extending the work of Rossignol [20], Falik and Samorodnitsky [9].
Suprema of chaos processes and the restricted isometry property
 Comm. Pure Appl. Math
"... We present a new bound for suprema of a special type of chaos processes indexed by a set of matrices, which is based on a chaining method. As applications we show significantly improved estimates for the restricted isometry constants of partial random circulant matrices and timefrequency structured ..."
Abstract

Cited by 32 (7 self)
 Add to MetaCart
We present a new bound for suprema of a special type of chaos processes indexed by a set of matrices, which is based on a chaining method. As applications we show significantly improved estimates for the restricted isometry constants of partial random circulant matrices and time-frequency structured random matrices. In both cases the required condition on the number m of rows in terms of the sparsity s and the vector length n is m ≳ s log² s log² n. Key words: compressive sensing, restricted isometry property, structured random matrices, chaos processes, γ2-functionals, generic chaining, partial random circulant matrices, random Gabor synthesis matrices.
Complexity regularization via localized random penalties
2004
"... In this article, model selection via penalized empirical loss minimization in nonparametric classification problems is studied. Datadependent penalties are constructed, which are based on estimates of the complexity of a small subclass of each model class, containing only those functions with small ..."
Abstract

Cited by 31 (4 self)
 Add to MetaCart
(Show Context)
In this article, model selection via penalized empirical loss minimization in nonparametric classification problems is studied. Data-dependent penalties are constructed, which are based on estimates of the complexity of a small subclass of each model class, containing only those functions with small empirical loss. The penalties are novel since those considered in the literature are typically based on the entire model class. Oracle inequalities using these penalties are established, and the advantage of the new penalties over those based on the complexity of the whole model class is demonstrated.
Stein’s method for concentration inequalities
Prob. Th. Rel. Fields, 2007
"... Abstract. We introduce a version of Stein’s method for proving concentration and moment inequalities in problems with dependence. Simple illustrative examples from combinatorics, physics, and mathematical statistics are provided. 1. Introduction and ..."
Abstract

Cited by 22 (5 self)
 Add to MetaCart
(Show Context)
We introduce a version of Stein’s method for proving concentration and moment inequalities in problems with dependence. Simple illustrative examples from combinatorics, physics, and mathematical statistics are provided.
Some local measures of complexity of convex hulls and generalization bounds
Proceedings of the 15th Annual Conference on Computational Learning Theory, 2002
"... Abstract. We investigate measures of complexity of function classes based on continuity moduli of Gaussian and Rademacher processes. For Gaussian processes, we obtain bounds on the continuity modulus on the convex hull of a function class in terms of the same quantity for the class itself. We also o ..."
Abstract

Cited by 20 (6 self)
 Add to MetaCart
(Show Context)
We investigate measures of complexity of function classes based on continuity moduli of Gaussian and Rademacher processes. For Gaussian processes, we obtain bounds on the continuity modulus on the convex hull of a function class in terms of the same quantity for the class itself. We also obtain new bounds on generalization error in terms of localized Rademacher complexities. This allows us to prove new results about generalization performance for convex hulls in terms of characteristics of the base class. As a byproduct, we obtain a simple proof of some of the known bounds on the entropy of convex hulls.