Results 1 – 10 of 91
Efficient and Practical Stochastic Subgradient Descent for Nuclear Norm Regularization
"... We describe novel subgradient methods for a broad class of matrix optimization problems involving nuclear norm regularization. Unlike existing approaches, our method executes very cheap iterations by combining low-rank stochastic subgradients with efficient incremental SVD updates, made possible by ..."
Cited by 10 (0 self)
Pegasos: Primal Estimated sub-GrAdient SOlver for SVM
"... We describe and analyze a simple and effective stochastic subgradient descent algorithm for solving the optimization problem cast by Support Vector Machines (SVM). We prove that the number of iterations required to obtain a solution of accuracy ɛ is Õ(1/ɛ), where each iteration operates on a singl ..."
Cited by 542 (20 self)
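The Pegasos step itself is simple enough to sketch. Below is a minimal illustration of a stochastic subgradient step for the regularized hinge loss; the function name and parameters (`lam`, `n_iters`) are our own, and the optional projection step analyzed in the paper is omitted:

```python
import numpy as np

def pegasos(X, y, lam=0.1, n_iters=1000, seed=0):
    """Stochastic subgradient descent for the SVM objective
    (lam/2) ||w||^2 + (1/n) sum_i max(0, 1 - y_i <w, x_i>)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, n_iters + 1):
        i = rng.integers(n)           # pick one example at random
        eta = 1.0 / (lam * t)         # step size 1/(lam * t)
        if y[i] * (X[i] @ w) < 1:     # margin violated: loss term contributes
            w = (1 - eta * lam) * w + eta * y[i] * X[i]
        else:                         # margin satisfied: regularizer only
            w = (1 - eta * lam) * w
    return w
```

Each iteration touches a single example, which is what makes the per-iteration cost independent of the training-set size.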
Incremental Subgradient Methods For Nondifferentiable Optimization
, 2001
"... We consider a class of subgradient methods for minimizing a convex function that consists of the sum of a large number of component functions. This type of minimization arises in a dual context from Lagrangian relaxation of the coupling constraints of large scale separable problems. The idea is to p ..."
Cited by 124 (10 self)
"... squares problems, such as those arising in the training of neural networks, and it has resulted in a much better practical rate of convergence than the steepest descent method. In this paper, we establish the convergence properties of a number of variants of incremental subgradient methods, including some ..."
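In code, the incremental scheme replaces a full subgradient of the sum with a step along one component's subgradient at a time. A minimal sketch under our own naming (the paper also analyzes randomized orders and diminishing step sizes, omitted here):

```python
import numpy as np

def incremental_subgradient(subgrads, w0, step=0.05, n_passes=200):
    """Minimize f(w) = sum_i f_i(w): each pass cycles through the
    components, taking one subgradient step per component f_i."""
    w = np.asarray(w0, dtype=float)
    for _ in range(n_passes):
        for g in subgrads:            # one cheap step per component
            w = w - step * g(w)
    return w
```

With a constant step the iterates reach only a neighborhood of the optimum: e.g. for f(w) = Σ_i |w − a_i|, whose minimizer is the median of the a_i, the iterate settles near the median but keeps oscillating at the scale of the step size.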
Stochastic Approximation Approach to Stochastic Programming
"... In this paper we consider optimization problems where the objective function is given in a form of the expectation. A basic difficulty of solving such stochastic optimization problems is that the involved multidimensional integrals (expectations) cannot be computed with high accuracy. The aim of th ..."
Cited by 267 (20 self)
"... use a specific (say linear) structure of the considered problem, while the SA approach is a crude subgradient method which often performs poorly in practice. We intend to demonstrate that a properly modified SA approach can be competitive and even significantly outperform the SAA method for a certain ..."
On stochastic subgradient mirror-descent algorithm with weighted averaging
 SIAM J. Optim.
, 2014
Primal-dual subgradient methods for convex problems
, 2005
"... (after revision) In this paper we present a new approach for constructing subgradient schemes for different types of nonsmooth problems with convex structure. Our methods are primal-dual since they are always able to generate a feasible approximation to the optimum of an appropriately formulated dual ..."
Cited by 143 (3 self)
"... the variants of subgradient schemes for nonsmooth convex minimization, minimax problems, saddle point problems, variational inequalities, and stochastic optimization. In all situations our methods are proved to be optimal from the viewpoint of worst-case black-box lower complexity bounds. ..."
Convergence Rate of Incremental Subgradient Algorithms
 Stochastic Optimization: Algorithms and Applications
, 2000
"... We consider a class of subgradient methods for minimizing a convex function that consists of the sum of a large number of component functions. This type of minimization arises in a dual context from Lagrangian relaxation of the coupling constraints of large scale separable problems. The idea is to p ..."
Cited by 65 (6 self)
"... squares problems, such as those arising in the training of neural networks, and it has resulted in a much better practical rate of convergence than the steepest descent method. In this paper, we present convergence results and estimates of the convergence rate of a number of variants of incremental ..."
Stochastic Coordinate Descent for Nonsmooth Convex Optimization
"... Stochastic coordinate descent, due to its practicality and efficiency, is increasingly popular in the machine learning and signal processing communities, as it has proven successful in several large-scale optimization problems, such as l1-regularized regression and Support Vector Machines, to name a few. In ..."
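For a concrete instance, a randomized coordinate step for the l1-regularized least squares problem can be sketched as follows; the update uses the exact per-coordinate minimizer (soft-thresholding), and all names here are ours rather than the paper's:

```python
import numpy as np

def soft_threshold(z, tau):
    """Soft-thresholding operator, the prox of tau * |.|."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def coord_descent_lasso(A, b, lam=0.1, n_iters=200, seed=0):
    """Randomized coordinate descent for
    min_w 0.5 ||A w - b||^2 + lam ||w||_1: each step exactly minimizes
    the objective over one randomly chosen coordinate."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = np.zeros(d)
    col_sq = (A ** 2).sum(axis=0)     # per-coordinate curvature ||A_j||^2
    r = A @ w - b                     # residual, maintained incrementally
    for _ in range(n_iters):
        j = rng.integers(d)
        if col_sq[j] == 0.0:
            continue                  # coordinate does not affect the fit
        g = A[:, j] @ r               # partial gradient w.r.t. w_j
        w_new = soft_threshold(w[j] - g / col_sq[j], lam / col_sq[j])
        r += (w_new - w[j]) * A[:, j] # O(n) residual update
        w[j] = w_new
    return w
```

Maintaining the residual is what keeps each step cheap: the per-iteration cost is O(n) rather than O(nd).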
On stochastic gradient and subgradient methods with adaptive steplength sequences
 Automatica
, 2012
"... Traditionally, stochastic approximation (SA) schemes have been popular choices for solving stochastic optimization problems. However, the performance of standard SA implementations can vary significantly based on the choice of the steplength sequence, and in general, little guidance is provided abo ..."
Cited by 15 (4 self)
"... of the original measure and the artificially introduced distribution. The resulting adaptive steplength schemes are applied to three stochastic optimization problems. In particular, we observe that both schemes perform well in practice and display markedly less reliance on user-defined parameters. ..."
Bayesian inference on phylogeny and its impact on evolutionary biology.
 Science
, 2001
"... 1 As a discipline, phylogenetics is becoming transformed by a flood of molecular data. These data allow broad questions to be asked about the history of life, but also present difficult statistical and computational problems. Bayesian inference of phylogeny brings a new perspective to a number of o ..."
Cited by 235 (10 self)
"... of outstanding issues in evolutionary biology, including the analysis of large phylogenetic trees and complex evolutionary models and the detection of the footprint of natural selection in DNA sequences. The idea that species are related through a history of common descent is an old one, predating Darwin. Yet ..."