Results 1–5 of 5
Minimax Learning Rates for Bipartite Ranking and Plug-in Rules
Abstract

Cited by 4 (2 self)
While it is now well-known in the standard binary classification setup that, under suitable margin assumptions and complexity conditions on the regression function, fast or even super-fast rates (i.e. rates faster than n^{-1/2}, or even faster than n^{-1}) can be achieved by plug-in classifiers, no result of this nature has yet been proved in the context of bipartite ranking, though the latter problem is akin to that of classification. It is the main purpose of the present paper to investigate this issue, by considering bipartite ranking as a nested continuous collection of cost-sensitive classification problems. A global low-noise condition is exhibited under which certain (plug-in) ranking rules are proved to achieve fast (but not super-fast) rates over a wide nonparametric class of models. A lower bound result is also stated in a specific situation, establishing that such rates are optimal from a minimax perspective.
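The plug-in principle this abstract refers to can be illustrated with a minimal sketch: estimate the regression function η(x) = P(Y = 1 | X = x) from data, then rank new points by the estimate. The one-dimensional setup and histogram estimator below are hypothetical choices made for illustration only, not the estimators or rates analyzed in the paper.

```python
import random


def fit_plugin_scorer(xs, ys, n_bins=10):
    """Crude plug-in estimate of eta(x) = P(Y = 1 | X = x) by histogram
    regression on [0, 1) (hypothetical one-dimensional setup)."""
    counts = [0] * n_bins
    positives = [0] * n_bins
    for x, y in zip(xs, ys):
        b = min(int(x * n_bins), n_bins - 1)
        counts[b] += 1
        positives[b] += y
    # Per-bin empirical frequency of the positive label (0.5 if the bin is empty).
    eta = [positives[b] / counts[b] if counts[b] else 0.5 for b in range(n_bins)]

    def score(x):
        return eta[min(int(x * n_bins), n_bins - 1)]

    return score


# A plug-in ranking rule simply orders new points by the estimated eta.
random.seed(0)
xs = [random.random() for _ in range(2000)]
ys = [1 if random.random() < x else 0 for x in xs]  # toy model: true eta(x) = x
score = fit_plugin_scorer(xs, ys)
ranked = sorted([0.1, 0.9, 0.5], key=score, reverse=True)
```

With the true η monotone increasing, the estimated scores recover the right order of the three test points (highest η first).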
Adaptive partitioning schemes for bipartite ranking: How to grow and prune a ranking tree
Machine Learning manuscript, 2009
Empirical risk minimization with statistics of higher order
, 2009
"... Observations Z1, ..., Zn are IID ..."
Ranking Multi-Class Data: Optimality and Pairwise Aggregation
, 2011
Abstract
It is the primary purpose of this paper to set the goals of ranking in a multiple-class context rigorously, following in the footsteps of recent results in the bipartite framework. Under specific likelihood-ratio monotonicity conditions, optimal solutions for this global learning problem are described in the ordinal situation, i.e. when there exists a natural order on the set of labels. Criteria reflecting ranking performance under these conditions, such as the ROC surface and its natural summary, the volume under the ROC surface (VUS), are next considered as targets for empirical optimization. Whereas plug-in techniques or the Empirical Risk Maximization principle can then be easily extended to the ordinal multi-class setting, reducing the K-partite ranking task to the solving of a collection of bipartite ranking problems, following in the footsteps of the pairwise comparison approach in classification, is in contrast more challenging. Here we consider a concept of ranking rule consensus based on the Kendall τ distance and show that, when it exists and is based on consistent ranking rules for the bipartite ranking subproblems defined by all consecutive pairs of labels, the consensus itself forms a consistent ranking rule in the VUS sense under adequate conditions. This result paves the way for extending the use of recently developed learning algorithms, tailored for bipartite ranking, to multi-class data in a valid theoretical framework. Preliminary experimental results are presented for illustration purposes.
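The pairwise-aggregation idea in this abstract can be illustrated with a toy consensus: score items with several bipartite rankers, then merge the induced orders. The median-rank aggregation below is a hypothetical, Kendall-τ-motivated stand-in for the paper's consensus concept, shown alongside the Kendall τ distance it is meant to keep small.

```python
import statistics


def kendall_tau_distance(perm_a, perm_b):
    """Kendall tau distance: number of item pairs ordered differently
    by the two rankings (lists of the same distinct items)."""
    pos_b = {item: i for i, item in enumerate(perm_b)}
    n = len(perm_a)
    return sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        if pos_b[perm_a[i]] > pos_b[perm_a[j]]
    )


def median_rank_consensus(items, scorers):
    """Aggregate several scoring functions by ranking items on their
    median rank across scorers -- a simple consensus heuristic, not
    the paper's (more general) Kendall-tau consensus."""
    ranks = {it: [] for it in items}
    for s in scorers:
        ordered = sorted(items, key=s, reverse=True)
        for r, it in enumerate(ordered):
            ranks[it].append(r)
    return sorted(items, key=lambda it: statistics.median(ranks[it]))


# Three toy scorers that induce the same order on four items:
scorers = [lambda x: x, lambda x: 2 * x + 1, lambda x: x ** 3]
items = [0.2, 0.8, 0.5, 0.1]
consensus = median_rank_consensus(items, scorers)
```

Since all three toy scorers are increasing, the consensus reproduces their common order and sits at Kendall τ distance 0 from each of them.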
Minimax Learning Rates for Bipartite Ranking and Plug-in Rules
, 2011