Results 11–20 of 3,352
... and tested: Regular Batch Gradient Descent Algorithm, Regularized Gradient Descent Algorithm and ...
, 2008
Abstract
The goal of the spam filtering problem is to identify an email as spam or not spam. One of the classic techniques used in spam filtering is to predict using logistic regression. Words that frequently occur in spam emails are used as the feature set in the regression problem. Thus, in this report, we examine some of the different techniques used for minimizing the logistic loss function and provide a performance analysis of the different techniques. Specifically, three different types of minimization techniques were implemented ...
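As a rough illustration of the first of these techniques, a minimal batch gradient descent on the logistic loss might look like the sketch below. The word-count features, labels, learning rate, and iteration count are all invented for the example, not taken from the report:

```python
import numpy as np

def logistic_loss(w, X, y):
    """Average logistic loss for labels y in {0, 1}, computed stably."""
    z = X @ w
    return float(np.mean(np.logaddexp(0.0, z) - y * z))

def batch_gradient_descent(X, y, lr=0.1, n_iters=500):
    """Plain (unregularized) batch gradient descent on the logistic loss."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))  # predicted spam probabilities
        w -= lr * X.T @ (p - y) / len(y)    # gradient of the average loss
    return w

# Invented word-count features: column 0 = "free", column 1 = "meeting".
X = np.array([[3.0, 0.0], [2.0, 1.0], [0.0, 2.0], [0.0, 3.0]])
y = np.array([1.0, 1.0, 0.0, 0.0])          # 1 = spam, 0 = not spam
w = batch_gradient_descent(X, y)
```

On this toy separable data the fitted weights give "free" a positive (spam-indicating) weight and "meeting" a negative one, and the loss drops well below its value at w = 0 (log 2).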
The Iso-Regularization Descent Algorithm for the LASSO
, 2010
Abstract

Cited by 1 (0 self)
Abstract. Following the introduction by Tibshirani of the LASSO technique for feature selection in regression, two algorithms were proposed by Osborne et al. for solving the associated problem. One is a homotopy method that gained popularity as the LASSO modification of the LARS algorithm. The other ...
Principal components: a descent algorithm
, 2012
Abstract
A descent procedure is proposed for the search of low-dimensional subspaces of a high-dimensional space that satisfy an optimality criterion. Specifically, the procedure is applied to finding the subspace spanned by the first m singular components of an n-dimensional dataset. The procedure minimizes ...
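The paper's particular procedure is not reproduced here, but the general idea of reaching a principal subspace by iterative descent can be sketched as a gradient step on the trace objective followed by QR re-orthonormalization (a form of subspace iteration). The data matrix and step size below are invented for illustration:

```python
import numpy as np

def principal_subspace_descent(X, m, lr=0.1, n_steps=500, seed=0):
    """Iterative search for the top-m principal subspace of the rows of X.

    Takes a gradient step on trace(W^T X^T X W) (equivalently, a descent
    step on the reconstruction error) and re-orthonormalizes W by QR.
    """
    rng = np.random.default_rng(seed)
    W, _ = np.linalg.qr(rng.standard_normal((X.shape[1], m)))
    C = X.T @ X                      # covariance-like matrix of the data
    for _ in range(n_steps):
        W = W + lr * (C @ W)         # step toward the dominant eigenspace
        W, _ = np.linalg.qr(W)       # project back to an orthonormal frame
    return W

# Invented 2-D data with one dominant direction (the first coordinate axis).
X = np.array([[2.0, 0.0], [-2.0, 0.0], [0.0, 0.5], [0.0, -0.5]])
W = principal_subspace_descent(X, m=1)
```

For m = 1 on this data the iteration converges to the first coordinate axis up to sign, which is the leading singular direction.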
An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
 JOURNAL OF MACHINE LEARNING RESEARCH
Abstract

Cited by 27 (6 self)
We describe an asynchronous parallel stochastic coordinate descent algorithm for minimizing smooth unconstrained or separably constrained functions. The method achieves a linear convergence rate on functions that satisfy an essential strong convexity property and a sublinear rate (1/K) on general convex functions ...
Recursive Aggregation of Estimators by the Mirror Descent Algorithm with Averaging
, 2008
Abstract

Cited by 23 (5 self)
We consider a recursive algorithm to construct an aggregated estimator from a finite number of base decision rules in the classification problem. The estimator approximately minimizes a convex risk functional under the ℓ1-constraint. It is defined by a stochastic version of the mirror descent algorithm ...
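A minimal sketch of the underlying idea, not the paper's estimator: entropic mirror descent over the probability simplex with iterate averaging, applied to a hypothetical linear risk standing in for the base rules' losses. The risk vector and step size are invented:

```python
import numpy as np

def mirror_descent_simplex(grad_fn, d, n_steps=200, lr=0.5):
    """Entropic mirror descent over the probability simplex, with averaging.

    With negative entropy as the mirror map, each step is an
    exponentiated-gradient update followed by renormalization; the
    returned estimate is the running average of the iterates.
    """
    v = np.full(d, 1.0 / d)          # start at uniform weights
    avg = np.zeros(d)
    for t in range(1, n_steps + 1):
        g = grad_fn(v)
        v = v * np.exp(-lr * g)      # multiplicative (mirror) step
        v /= v.sum()                 # back onto the simplex
        avg += (v - avg) / t         # running average of the iterates
    return avg

# Hypothetical per-rule risks: with a linear risk c @ weights, the best
# aggregate over the simplex puts all weight on the lowest-risk rule.
c = np.array([1.0, 0.2, 0.5])
weights = mirror_descent_simplex(lambda v: c, d=3)
```

The averaged weights remain on the simplex and concentrate on the second rule, which has the smallest risk.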
A new feasible descent algorithm combining SQP with ...
, 2005
Abstract
... is presented. At each iteration of the new algorithm, a convex quadratic program (QP) is solved and a master direction is obtained, and an improved (feasible descent) direction is ... Keywords: descent algorithm; superlinear convergence. Applied Mathematics and Computation 162 (2005) 1065–1081.
Regularization paths for generalized linear models via coordinate descent
, 2009
Abstract

Cited by 724 (15 self)
We develop fast algorithms for estimation of generalized linear models with convex penalties. The models include linear regression, two-class logistic regression, and multinomial regression problems, while the penalties include ℓ1 (the lasso), ℓ2 (ridge regression) and mixtures of the two (the elastic net). The algorithms use cyclical coordinate descent, computed along a regularization path. The methods can handle large problems and can also deal efficiently with sparse features. In comparative timings we find that the new algorithms are considerably faster than competing methods.
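The cyclical coordinate descent at the core of this approach can be sketched for the lasso case at a single penalty value; the full method would rerun this over a decreasing sequence of penalties with warm starts. The data below is an invented sparse-recovery toy problem, not from the paper:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding, the closed-form solution of the 1-D lasso subproblem."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_coordinate_descent(X, y, lam, n_sweeps=100):
    """Cyclical coordinate descent for 0.5/n * ||y - Xw||^2 + lam * ||w||_1.

    Each coordinate is minimized exactly in turn; one sweep visits all of them.
    """
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    r = y - X @ w                        # residual for the current w
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * w[j]          # partial residual excluding coordinate j
            w[j] = soft_threshold(X[:, j] @ r / n, lam) / col_sq[j]
            r -= X[:, j] * w[j]          # restore the residual with the new value
    return w

# Invented sparse ground truth recovered from noiseless data.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))
y = X @ np.array([2.0, 0.0, -1.5, 0.0])
w_hat = lasso_coordinate_descent(X, y, lam=0.1)
```

The recovered coefficients are close to the sparse truth, with the zero coefficients shrunk to (near) zero by the ℓ1 penalty.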
Stochastic Gradient Descent Algorithm in the Computational Network Toolkit
Abstract
We introduce the stochastic gradient descent algorithm used in the Computational Network Toolkit (CNTK) — a general purpose machine learning toolkit written in C++ for training and using models that can be expressed as a computational network. We describe the algorithm used to compute the gradients ...
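CNTK's actual implementation is not shown here; purely as a generic illustration of stochastic gradient descent itself, a minimal version on an invented one-parameter least-squares problem looks like this:

```python
import random

def sgd(grad_fn, data, w0, lr=0.05, epochs=50, seed=0):
    """Minimal stochastic gradient descent: one sampled example per update."""
    rng = random.Random(seed)
    w = w0
    for _ in range(epochs):
        for x, y in rng.sample(data, len(data)):  # one shuffled pass per epoch
            w -= lr * grad_fn(w, x, y)
    return w

# Fit y ≈ w * x by least squares; the per-example gradient is (w*x - y) * x.
# The data and learning rate are invented (the true slope is 2).
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w_fit = sgd(lambda w, x, y: (w * x - y) * x, data, w0=0.0)
```

Because the data is noiseless and consistent, the iterates contract toward the true slope on every update.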
MULTIPLE-GRADIENT DESCENT ALGORITHM FOR MULTIOBJECTIVE OPTIMIZATION
, 2012
Abstract

Cited by 1 (0 self)
Abstract. The steepest-descent method is a well-known and effective single-objective descent algorithm when the gradient of the objective function is known. Here, we propose a particular generalization of this method to multiobjective optimization by considering the concurrent minimization of n smooth ...
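The general n-objective construction selects the minimum-norm element of the convex hull of the gradients, which requires solving a small quadratic program; as a simplified sketch, the two-objective case has a closed form (minimum-norm point of a segment). The example gradients are invented:

```python
import numpy as np

def common_descent_direction(g1, g2):
    """Minimum-norm point of the segment [g1, g2] (two-objective case).

    If the result is nonzero, its negative is a descent direction for
    both objectives at once; a zero result indicates Pareto stationarity.
    """
    diff = g1 - g2
    denom = float(diff @ diff)
    if denom == 0.0:
        return g1.copy()             # identical gradients: either one works
    t = float(np.clip(-(g2 @ diff) / denom, 0.0, 1.0))
    return t * g1 + (1.0 - t) * g2

# Invented gradients of two objectives at the current point.
g1 = np.array([1.0, 0.0])
g2 = np.array([0.0, 1.0])
d = common_descent_direction(g1, g2)
```

Here d = (0.5, 0.5) has positive inner product with both gradients, so stepping along -d decreases both objectives.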