Lectures on the central limit theorem for empirical processes
 Probability and Banach Spaces, 1986
Cited by 135 (9 self)
Concentration inequalities are used to derive some new inequalities for ratio-type suprema of empirical processes. These general inequalities are used to prove several new limit theorems for ratio-type suprema and to recover a number of the results from [1] and [2]. As a statistical application, an oracle inequality for nonparametric regression is obtained via ratio bounds.
Pseudo-maximization and self-normalized processes, 2007
Cited by 15 (0 self)
Self-normalized processes are basic to many probabilistic and statistical studies. They arise naturally in the study of stochastic integrals, martingale inequalities and limit theorems, likelihood-based methods in hypothesis testing and parameter estimation, and Studentized pivots and bootstrap-t methods for confidence intervals. In contrast to standard normalization, large values of the observations play a lesser role because they appear in both the numerator and the self-normalized denominator, making the process scale invariant and contributing to its robustness. Herein we survey a number of results for self-normalized processes in the case of dependent variables and describe a key method called "pseudo-maximization" that has been used to derive these results. In the multivariate case, self-normalization consists of multiplying by the inverse of a positive definite matrix (instead of dividing by a positive random variable as in the scalar case) and is ubiquitous in statistical applications, examples of which are given.
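The scale invariance this abstract describes is easy to see concretely. The following sketch is not from the cited survey; it is a minimal illustration of the scalar case, where the self-normalized sum is S_n / V_n with V_n the root of the sum of squares:

```python
import random

def self_normalized_sum(xs):
    # S_n / V_n with S_n = sum of the observations and
    # V_n = sqrt(sum of squared observations).
    s = sum(xs)
    v = sum(x * x for x in xs) ** 0.5
    return s / v

random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(1000)]
t = self_normalized_sum(xs)
# Rescaling every observation by c > 0 multiplies S_n and V_n by the
# same factor c, so the statistic is unchanged: scale invariance.
t_scaled = self_normalized_sum([5.0 * x for x in xs])
```

Because the scale factor cancels between numerator and denominator, `t` and `t_scaled` agree up to rounding, which is the robustness mechanism the abstract points to.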
On Uniform Deviations of General Empirical Risks with Unboundedness, Dependence, and High Dimensionality
 Journal of Machine Learning Research, 2009
Cited by 2 (0 self)
The statistical learning theory of risk minimization depends heavily on probability bounds for uniform deviations of the empirical risks. Classical probability bounds using Hoeffding's inequality cannot accommodate more general situations with unbounded loss and dependent data. The current paper introduces an inequality that extends Hoeffding's inequality to handle these more general situations. We will apply this inequality to provide probability bounds for uniform deviations in a very general framework, which can involve discrete decision rules, unbounded loss, and a dependence structure that can be more general than either martingale or strong mixing. We will consider two examples with high-dimensional predictors: autoregression (AR) with ℓ1 loss, and an ARX model with variable selection for sign classification, which uses both lagged responses and exogenous predictors.
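For context on the baseline this paper extends: the classical two-sided Hoeffding inequality states P(|S_n/n − μ| ≥ t) ≤ 2 exp(−2nt²/(b−a)²) for i.i.d. variables bounded in [a, b]. The sketch below (not from the paper) checks the bound empirically for Uniform(0, 1) data:

```python
import math
import random

def hoeffding_bound(n, t, a=0.0, b=1.0):
    # Two-sided Hoeffding bound for the mean of n i.i.d. variables in [a, b]:
    # P(|mean - mu| >= t) <= 2 * exp(-2 * n * t^2 / (b - a)^2)
    return 2.0 * math.exp(-2.0 * n * t * t / (b - a) ** 2)

random.seed(1)
n, t, trials = 200, 0.1, 2000
mu = 0.5  # mean of Uniform(0, 1)
hits = sum(
    1
    for _ in range(trials)
    if abs(sum(random.random() for _ in range(n)) / n - mu) >= t
)
empirical = hits / trials
bound = hoeffding_bound(n, t)  # 2 * exp(-4), roughly 0.037
```

The empirical deviation frequency stays below the bound, as Hoeffding guarantees for bounded i.i.d. data; the point of the cited paper is precisely that this guarantee fails to cover unbounded losses and dependent observations.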
Exponential Bounds for Multivariate Self-normalized Sums
 Electronic Communications in Probability, 2008
Cited by 1 (0 self)
In a nonparametric framework, we establish some non-asymptotic bounds for self-normalized sums and quadratic forms in the multivariate case, for symmetric and general random variables. These bounds are entirely explicit and, in the general case, depend essentially on the kurtosis of the Euclidean norm of the standardized random variables.
Likelihood Ratio Inequalities with Applications to Various Mixtures, 2002
We give two simple inequalities on likelihood ratios. A first application is the consistency of the maximum-penalized-marginal-likelihood estimator of the number of populations in a mixture with Markov regime. The second application is the derivation of the asymptotic power of the likelihood ratio test under loss of identifiability for contiguous alternatives. Finally, we propose self-normalized score tests that have exponentially decreasing level and asymptotic power 1. © 2002 Éditions scientifiques et médicales Elsevier SAS
6.1. Stability and Ergodicity of Piecewise Deterministic Markov Processes
"... Activity Report 2008, Table of contents ..."
Large and Moderate Deviations for Hotelling's T²-Statistic
 Electronic Communications in Probability 11 (2006), 149–159
Keywords: large deviation; moderate deviation; self-normalized partial sums; law of the iterated logarithm; T²-statistic. Let X, X1, X2, ... be i.i.d. ℝ^d-valued random variables. We prove large and moderate deviations for Hotelling's T²-statistic when X is in the generalized domain of attraction of the normal law.
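Hotelling's T² is itself a multivariate self-normalized statistic: T² = n x̄ᵀ S⁻¹ x̄, where S is the sample covariance matrix, so normalization is by the inverse of a positive definite matrix rather than by a scalar. The pure-Python sketch below (not from the paper, restricted to d = 2 for simplicity) computes T² and exhibits its scale invariance:

```python
import random

def hotelling_t2(data):
    # Hotelling's T^2 for d = 2 and null mean zero:
    # T^2 = n * xbar^T S^{-1} xbar, with S the unbiased sample covariance.
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    det = sxx * syy - sxy * sxy
    # Apply the inverse of the 2x2 covariance matrix to the mean vector:
    # S^{-1} = (1/det) * [[syy, -sxy], [-sxy, sxx]].
    ix = (syy * mx - sxy * my) / det
    iy = (sxx * my - sxy * mx) / det
    return n * (mx * ix + my * iy)

random.seed(2)
sample = [(random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)) for _ in range(500)]
t2 = hotelling_t2(sample)
# Rescaling the data by c scales xbar by c and S by c^2, so T^2 is unchanged.
t2_scaled = hotelling_t2([(3.0 * x, 3.0 * y) for x, y in sample])
```

Since S⁻¹ is positive definite, T² is nonnegative, and the cancellation of the scale factor between x̄ and S is exactly the matrix analogue of the scalar self-normalization above.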