Results 1 – 10 of 266,857
Maximizing a Monotone Submodular Function subject to a Matroid Constraint
, 2008
Abstract
Cited by 63 (0 self)
Let f : 2^X → R+ be a monotone submodular set function, and let (X, I) be a matroid. We consider the problem max_{S ∈ I} f(S). It is known that the greedy algorithm yields a 1/2-approximation [14] for this problem. For certain special cases, e.g. max_{|S| ≤ k} f(S), the greedy algorithm yields a (1 − 1/e)-approximation …
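The greedy rule mentioned in the abstract above can be sketched for the special cardinality-constrained case max_{|S| ≤ k} f(S), where it attains the (1 − 1/e) guarantee. This is a minimal illustration, not code from the paper; the coverage function and all names below are illustrative assumptions.

```python
def greedy_max(f, ground_set, k):
    """Pick up to k elements, each time adding the one with the
    largest marginal gain f(S + x) - f(S)."""
    S = set()
    for _ in range(k):
        best, best_gain = None, 0.0
        for x in ground_set - S:
            gain = f(S | {x}) - f(S)   # marginal value of adding x
            if gain > best_gain:
                best, best_gain = x, gain
        if best is None:               # no element improves f; stop early
            break
        S.add(best)
    return S

# Example: set coverage, a classic monotone submodular function.
sets = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}, "d": {1, 6}}
cover = lambda S: len(set().union(*(sets[x] for x in S))) if S else 0

print(sorted(greedy_max(cover, set(sets), 2)))  # -> ['a', 'c']
```

With k = 2 the greedy rule first takes a gain-3 set and then the complementary gain-3 set, covering all six elements; under a general matroid constraint the same marginal-gain rule gives the 1/2 guarantee the abstract cites.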
Adaptive submodular optimization under matroid constraints
, 2011
Abstract
Cited by 3 (1 self)
Many important problems in discrete optimization require maximization of a monotonic submodular function subject to matroid constraints. For these problems, a simple greedy algorithm is guaranteed to obtain near-optimal solutions. In this article, we extend this classic result to a general class of …
Submodular functions, matroids and certain polyhedra
, 2003
Abstract
Cited by 352 (0 self)
The viewpoint of the subject of matroids, and related areas of lattice theory, has always been, in one way or another, abstraction of algebraic dependence or, equivalently, abstraction of the incidence relations in geometric representations of algebra. Often one of the main derived facts is that all …
Non-monotone submodular maximization under matroid and knapsack constraints
 In Proc. 41st ACM Symp. on Theory of Computing
, 2009
Abstract
Cited by 37 (1 self)
Submodular function maximization is a central problem in combinatorial optimization, generalizing many important problems including Max Cut in directed/undirected graphs and in hypergraphs, certain constraint satisfaction problems, maximum entropy sampling, and maximum facility location problems. Unlike submodular minimization, submodular maximization is NP-hard. In this paper, we give the first constant-factor approximation algorithm for maximizing any nonnegative submodular function subject to multiple matroid or knapsack constraints. We emphasize that our results are for non-monotone …
Submodular Maximization with Cardinality Constraints
Abstract
We consider the problem of maximizing a (non-monotone) submodular function subject to a cardinality constraint. In addition to capturing well-known combinatorial optimization problems, e.g., Max-k-Coverage and Max-Bisection, this problem has applications in other more practical settings such as nat …
Greedy Function Approximation: A Gradient Boosting Machine
 Annals of Statistics
, 2000
Abstract
Cited by 951 (12 self)
Function approximation is viewed from the perspective of numerical optimization in function space, rather than parameter space. A connection is made between stagewise additive expansions and steepest-descent minimization. A general gradient-descent "boosting" paradigm is developed for additive …
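The stagewise procedure this abstract describes can be sketched for the squared-error case, where the negative gradient in function space is simply the residual vector. This is a minimal illustration under that assumption; the one-split "stump" weak learner, step size, and data are illustrative choices, not the paper's implementation.

```python
import numpy as np

def fit_stump(x, r):
    """Fit the best single-threshold split of 1-D inputs x to residuals r."""
    best = None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        lm, rm = left.mean(), right.mean()
        sse = ((left - lm) ** 2).sum() + ((right - rm) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda z: np.where(z <= t, lm, rm)

def boost(x, y, n_stages=50, lr=0.1):
    """Stagewise additive expansion: at each stage, fit a weak learner to
    the negative gradient of squared error (the residuals) and take a
    small step in function space. A real implementation would store the
    stumps to predict on new data; here we just return the fit."""
    pred = np.full_like(y, y.mean(), dtype=float)
    for _ in range(n_stages):
        h = fit_stump(x, y - pred)     # fit the current residuals
        pred = pred + lr * h(x)        # shrunken gradient-descent step
    return pred

x = np.linspace(0, 1, 100)
y = np.sin(2 * np.pi * x)
pred = boost(x, y)
print(f"train MSE: {float(np.mean((y - pred) ** 2)):.4f}")
```

The shrinkage factor `lr` is the "slow learning" idea associated with this paradigm: many small steps along successive weak-learner directions rather than one full projection per stage.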
The Extended Linear Complementarity Problem
, 1993
Abstract
Cited by 776 (28 self)
We consider an extension of the horizontal linear complementarity problem, which we call the extended linear complementarity problem (XLCP). With the aid of a natural bilinear program, we establish various properties of this extended complementarity problem; these include the convexity of the bilinear objective function under a monotonicity assumption, the polyhedrality of the solution set of a monotone XLCP, and an error bound result for a nondegenerate XLCP. We also present a finite, sequential linear programming algorithm for solving the nonmonotone XLCP.
Submodular Maximization Over Multiple Matroids via Generalized Exchange Properties
, 2009
Abstract
Cited by 45 (6 self)
Submodular function maximization is a central problem in combinatorial optimization, generalizing many important NP-hard problems including Max Cut in digraphs, graphs and hypergraphs, certain constraint satisfaction problems, maximum-entropy sampling, and maximum facility-location problems. Our main result is that for any k ≥ 2 and any ε > 0, there is a natural local-search algorithm which has an approximation guarantee of 1/(k + ε) for the problem of maximizing a monotone submodular function subject to k matroid constraints. This improves a 1/(k + 1)-approximation of Nemhauser, Wolsey …
Algorithms for Non-negative Matrix Factorization
 In NIPS
, 2001
Abstract
Cited by 1230 (5 self)
Non-negative matrix factorization (NMF) has previously been shown to be a useful decomposition for multivariate data. Two different multiplicative algorithms for NMF are analyzed. They differ only slightly in the multiplicative factor used in the update rules. One algorithm can be shown to minimize the conventional least squares error while the other minimizes the generalized Kullback-Leibler divergence. The monotonic convergence of both algorithms can be proven using an auxiliary function analogous to that used for proving convergence of the Expectation-Maximization algorithm.
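The least-squares variant of the multiplicative updates this abstract analyzes can be sketched in a few lines: each factor is rescaled elementwise by a ratio of matrix products, which keeps the factors nonnegative and, as the paper proves, never increases the error ||V − WH||. The matrix sizes, rank, iteration count, and the small `eps` guard are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((20, 30))           # nonnegative data matrix to factorize
r = 5                              # factorization rank
W = rng.random((20, r)) + 0.1      # nonnegative initial factors
H = rng.random((r, 30)) + 0.1

eps = 1e-12                        # guard against division by zero
errs = []
for _ in range(200):
    # Multiplicative updates for the least-squares objective:
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
    errs.append(float(np.linalg.norm(V - W @ H)))

# The error is non-increasing across iterations (monotone convergence).
assert all(a >= b - 1e-9 for a, b in zip(errs, errs[1:]))
print(f"error: {errs[0]:.3f} -> {errs[-1]:.3f}")
```

The KL-divergence variant mentioned in the abstract has the same multiplicative shape with different numerator and denominator terms; the auxiliary-function argument covers both.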