Results 1 - 10 of 19,868
The group Lasso for logistic regression - Journal of the Royal Statistical Society, Series B, 2008
"... The group lasso is an extension of the lasso to do variable selection on (predefined) groups of variables in linear regression models. The estimates have the attractive property of being invariant under groupwise orthogonal reparameterizations. We extend the group lasso to logistic regression ..."
Cited by 276 (11 self)
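As a rough sketch of the penalty the summary describes, the group-lasso objective for logistic regression can be written in a few lines of NumPy. The group partition, labels in {0, 1}, and the unweighted per-group norms here are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def group_lasso_objective(w, X, y, groups, lam):
    """Penalized negative log-likelihood for logistic regression.

    y is in {0, 1}; `groups` is a list of index arrays partitioning w.
    The penalty sum of ||w_g||_2 terms is what lets whole groups of
    coefficients drop out together.
    """
    z = X @ w
    # log(1 + exp(z)) computed stably via logaddexp
    nll = np.sum(np.logaddexp(0.0, z) - y * z)
    penalty = lam * sum(np.linalg.norm(w[g]) for g in groups)
    return nll + penalty

# toy check: at w = 0 the penalty vanishes and the NLL is n * log(2)
X = np.ones((4, 3))
y = np.array([0, 1, 0, 1])
w = np.zeros(3)
groups = [np.array([0, 1]), np.array([2])]
print(group_lasso_objective(w, X, y, groups, lam=0.1))  # 4 * log(2) ≈ 2.7726
```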
Logistic regression
"... This text offers an introduction to binary logistic regression, a confirmatory technique for statistically modelling the effect of one or several predictors on a binary response variable. It is explained why logistic regression is exceptionally well suited for the comparison of near synonyms in corp ..."
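As a minimal illustration of the technique this text introduces, binary logistic regression with one predictor can be fit as follows; the synthetic data and the scikit-learn estimator are illustrative choices, not taken from the text:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# synthetic data: one predictor shifts the log-odds of a binary response
X = rng.normal(size=(200, 1))
p = 1.0 / (1.0 + np.exp(-(0.5 + 2.0 * X[:, 0])))   # true P(y=1 | x)
y = (rng.uniform(size=200) < p).astype(int)

model = LogisticRegression().fit(X, y)
print(model.intercept_, model.coef_)   # estimates of the true 0.5 and 2.0
print(model.predict_proba(X[:3]))      # fitted P(y=0|x), P(y=1|x) per row
```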
Additive Logistic Regression: a Statistical View of Boosting - Annals of Statistics, 1998
"... Boosting (Freund & Schapire 1996, Schapire & Singer 1998) is one of the most important recent developments in classification methodology. The performance of many classification algorithms can often be dramatically improved by sequentially applying them to reweighted versions of the input data ..."
Cited by 1750 (25 self)
"... be viewed as an approximation to additive modeling on the logistic scale using maximum Bernoulli likelihood as a criterion. We develop more direct approximations and show that they exhibit nearly identical results to boosting. Direct multi-class generalizations based on multinomial likelihood are derived ..."
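The "additive modeling on the logistic scale" view can be sketched as stagewise fitting of small regressors to the gradient of the Bernoulli log-likelihood. This is a generic gradient-boosting sketch under that interpretation, with an illustrative stump learner and learning rate, not the paper's exact algorithm:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # binary target

F = np.zeros(len(y))                         # additive model on the logit scale
lr = 0.5                                     # illustrative learning rate
for _ in range(50):
    p = 1.0 / (1.0 + np.exp(-F))             # current fitted probabilities
    resid = y - p                            # gradient of the Bernoulli log-likelihood
    stump = DecisionTreeRegressor(max_depth=1).fit(X, resid)
    F += lr * stump.predict(X)               # stagewise additive update

acc = np.mean((F > 0) == (y == 1))
print(acc)                                   # training accuracy of the boosted model
```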
On Discriminative vs. Generative classifiers: A comparison of logistic regression and naive Bayes, 2001
"... We compare discriminative and generative learning as typified by logistic regression and naive Bayes. We show, contrary to a widely held belief that discriminative classifiers are almost always to be preferred, that there can often be two distinct regimes of performance as the training set size is increased ..."
Cited by 520 (8 self)
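The two-regimes claim can be probed with a small experiment of the kind the abstract describes: compare naive Bayes and logistic regression at a small and a large training-set size. The data generator and sizes below are illustrative assumptions, not the paper's experiments:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(2)

def gaussian_data(n):
    # two balanced Gaussian classes, matching the generative model
    # that naive Bayes assumes
    y = np.repeat([0, 1], n // 2)
    X = rng.normal(size=(len(y), 5)) + y[:, None] * 1.0
    return X, y

X_test, y_test = gaussian_data(2000)
for n_train in (10, 1000):
    X_tr, y_tr = gaussian_data(n_train)
    nb = GaussianNB().fit(X_tr, y_tr).score(X_test, y_test)
    lr = LogisticRegression().fit(X_tr, y_tr).score(X_test, y_test)
    print(n_train, round(nb, 3), round(lr, 3))
```

After the loop, `nb` and `lr` hold the test accuracies at the larger training size.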
Maximum-Margin Logistic Regression, 2005
"... The Regularized Logistic Regression (RLR) minimization objective is ..."
Cited by 1 (0 self)
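The snippet truncates before stating the objective, and the paper's exact formula is left as-is above. For orientation only, the standard L2-regularized logistic regression objective usually meant by "RLR" can be sketched as below, with labels in {-1, +1} and the regularization weight `lam` as assumptions:

```python
import numpy as np

def rlr_objective(w, X, y, lam):
    """Common form of the L2-regularized logistic loss, labels y in {-1, +1}:
        sum_i log(1 + exp(-y_i * w.x_i)) + (lam / 2) * ||w||^2
    (a standard formulation, not necessarily the paper's exact notation)."""
    margins = y * (X @ w)
    return np.sum(np.logaddexp(0.0, -margins)) + 0.5 * lam * (w @ w)

# toy check: at w = 0 every margin is 0, so the loss is n * log(2)
X = np.array([[1.0, 0.0], [0.0, 1.0]])
y = np.array([1.0, -1.0])
print(rlr_objective(np.zeros(2), X, y, lam=1.0))  # 2 * log(2) ≈ 1.3863
```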
logistic regression, 2007
"... Abstract: Inference based on the penalized density ratio model is proposed and studied. The model under consideration is specified by assuming that the log–likelihood function of two unknown densities is of some parametric form. The model has been extended to cover multiple samples problems while it ..."
Abstract
- Add to MetaCart
Abstract: Inference based on the penalized density ratio model is proposed and studied. The model under consideration is specified by assuming that the log-likelihood function of two unknown densities is of some parametric form. The model has been extended to cover multiple-sample problems, and its theoretical properties have been investigated using large-sample theory. A main application of the density ratio model is testing whether two or more distributions are equal. We extend these results by arguing that the penalized maximum empirical likelihood estimator has smaller mean square error than the ordinary maximum likelihood estimator, especially for small samples. In fact, penalization resolves any existence problems of the estimators, and a modified Wald-type test statistic can be employed for testing equality of the two distributions. A limited simulation study further supports the theory.
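For orientation, the parametric form the abstract alludes to is usually written, in the standard density ratio model notation (stated here as general background, not quoted from the paper), as

```latex
\log \frac{g(x)}{f(x)} = \alpha + \beta^{\top} h(x),
```

so that, for samples drawn from the two densities f and g, sample membership follows a logistic regression in the known function h(x); this is the model's link to logistic regression.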
Privacy-preserving logistic regression
"... This paper addresses the important tradeoff between privacy and learnability, when designing algorithms for learning from private databases. We focus on privacy-preserving logistic regression. First we apply an idea of Dwork et al. [7] to design a privacy-preserving logistic regression algorithm. Th ..."
Cited by 58 (2 self)
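A hedged sketch of the output-perturbation idea the abstract attributes to Dwork et al. [7]: train the regularized model as usual, then release only weights with additive Laplace noise. The noise scale below is a placeholder, not the paper's calibrated sensitivity bound:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 3))
y = (X @ np.array([1.0, -1.0, 0.5]) > 0).astype(int)

# L2 regularization (the C parameter) is what bounds the sensitivity
# of the learned weights to any single training record
model = LogisticRegression(C=1.0).fit(X, y)

noise_scale = 0.1   # placeholder; a real scheme derives this from the
                    # privacy budget, the sample size, and the regularizer
noisy_coef = model.coef_ + rng.laplace(scale=noise_scale, size=model.coef_.shape)
print(noisy_coef)   # the private release is the noisy weights only
```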