Results 1–10 of 62
FemtoCaching: Wireless Video Content Delivery through Distributed Caching Helpers
 Submitted for publication; available at http://arxiv.org/pdf/1109.4179v1
"... Abstract—We suggest a novel approach to handle the ongoing explosive increase in the demand for video content in mobile devices. We envision femtocelllike base stations, which we call helpers, with weak backhaul links but large storage capabilities. These helpers form a wireless distributed caching ..."
Abstract

Cited by 56 (9 self)
 Add to MetaCart
Abstract—We suggest a novel approach to handle the ongoing explosive increase in the demand for video content in mobile devices. We envision femtocell-like base stations, which we call helpers, with weak backhaul links but large storage capabilities. These helpers form a wireless distributed caching network that assists the macro base station by handling requests of popular files that have been cached. We formalize the wireless distributed caching optimization problem for the case that files are encoded using fountain/MDS codes. We express the problem as a convex optimization. By adding additional variables we reduce it to a linear program. On the practical side, we present a detailed simulation of a university campus scenario covered by a single 3GPP LTE R8 cell and several helper nodes using a simplified 802.11n protocol. We use a real campus trace of video requests and show how distributed caching can increase the number of served users by as much as 600–700%.
Nonmonotone submodular maximization under matroid and knapsack constraints
 In Proc. 41st ACM Symp. on Theory of Computing
, 2009
"... Submodular function maximization is a central problem in combinatorial optimization, generalizing many important problems including Max Cut in directed/undirected graphs and in hypergraphs, certain constraint satisfaction problems, maximum entropy sampling, and maximum facility location problems. Un ..."
Abstract

Cited by 37 (1 self)
 Add to MetaCart
(Show Context)
Submodular function maximization is a central problem in combinatorial optimization, generalizing many important problems including Max Cut in directed/undirected graphs and in hypergraphs, certain constraint satisfaction problems, maximum entropy sampling, and maximum facility location problems. Unlike submodular minimization, submodular maximization is NP-hard. In this paper, we give the first constant-factor approximation algorithm for maximizing any nonnegative submodular function subject to multiple matroid or knapsack constraints. We emphasize that our results are for nonmonotone submodular functions. In particular, for any constant k, we present a 1/(k + 2 + 1/k + ε)-approximation for the submodular maximization problem under k matroid constraints, and a (1/5 − ε)-approximation algorithm for this problem subject to k knapsack constraints (ε > 0 is any constant). We improve the approximation guarantee of our algorithm to 1/(k + 1 + 1/(k−1) + ε) for k ≥ 2 partition matroid constraints. This idea also gives a 1/(k + 1/(k−1) + ε)-approximation for maximizing a monotone submodular function subject to k ≥ 2 partition matroids, which improves over the previously best known guarantee of 1/(k + ε).
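For orientation, the sketch below shows the classical greedy for the much simpler monotone case under a single partition matroid, not the non-monotone, multi-constraint algorithms of this paper. The coverage instance (sensors, target sets, location groups) is made up for illustration.

```python
# Illustrative only: classical greedy for MONOTONE submodular maximization
# under one partition matroid (pick at most one element per group).
# Ground set: candidate sensors; each covers some targets (toy data).
covers = {"a1": {1, 2}, "a2": {2, 3}, "b1": {3, 4}, "b2": {1, 4, 5}}
groups = {"a": {"a1", "a2"}, "b": {"b1", "b2"}}   # at most 1 pick per group

def f(S):
    """Monotone submodular objective: number of targets covered."""
    return len(set().union(*(covers[e] for e in S)) if S else set())

def greedy():
    S, used = set(), set()
    while True:
        # Feasible additions: not chosen yet, and their group is unused.
        feasible = [e for e in covers
                    if e not in S
                    and next(g for g in groups if e in groups[g]) not in used]
        if not feasible:
            break
        best = max(feasible, key=lambda e: f(S | {e}) - f(S))
        if f(S | {best}) - f(S) <= 0:
            break
        S.add(best)
        used.add(next(g for g in groups if best in groups[g]))
    return S
```

Here greedy first takes the largest single set ("b2", three targets), then the best remaining pick from the other group ("a2"), covering all five targets. For one matroid this greedy is a 1/2-approximation in the monotone case.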
Constrained nonmonotone submodular maximization: offline and secretary algorithms
, 2010
"... Constrained submodular maximization problems have long been studied, with nearoptimal results known under a variety of constraints when the submodular function is monotone. The case of nonmonotone submodular maximization is less understood: the first approximation algorithms even for the unconstrai ..."
Abstract

Cited by 22 (0 self)
 Add to MetaCart
Constrained submodular maximization problems have long been studied, with near-optimal results known under a variety of constraints when the submodular function is monotone. The case of nonmonotone submodular maximization is less understood: the first approximation algorithms even for the unconstrained setting were given by Feige et al. (FOCS ’07). More recently, Lee et al. (STOC ’09, APPROX ’09) show how to approximately maximize nonmonotone submodular functions when the constraints are given by the intersection of
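One of the simplest results in this line, due to Feige, Mirrokni, and Vondrák for the unconstrained non-monotone case, is that a uniformly random subset is a 1/4-approximation in expectation. A toy sanity check on a directed cut function (a standard non-monotone submodular function); the digraph is made up:

```python
import random
import itertools

# f(S) = number of arcs leaving S is non-monotone submodular. Toy digraph.
arcs = [("a", "b"), ("b", "c"), ("c", "a"), ("a", "d"), ("d", "b")]
nodes = {"a", "b", "c", "d"}

def cut(S):
    return sum(1 for (u, v) in arcs if u in S and v not in S)

# Exact optimum by brute force over all 2^4 subsets.
opt = max(cut(set(S)) for r in range(len(nodes) + 1)
          for S in itertools.combinations(nodes, r))

# Baseline: include each node independently with probability 1/2.
random.seed(0)
samples = [cut({v for v in nodes if random.random() < 0.5})
           for _ in range(2000)]
avg = sum(samples) / len(samples)
```

Each arc leaves a random subset with probability 1/4, so the expected cut value is 5/4 here, comfortably above opt/4; the empirical average is a sanity check, not a proof.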
MaxSum Diversification, Monotone Submodular Functions and Dynamic Updates (Extended Abstract)
, 2012
"... ..."
(Show Context)
On kColumn Sparse Packing Programs
, 2009
"... We consider the class of packing integer programs (PIPs) that are column sparse, i.e. there is a specified upper bound k on the number of constraints that each variable appears in. We give an ek+o(k)approximation algorithm for kcolumn sparse PIPs, improving on recent results of k2 · 2k [14] and O(k ..."
Abstract

Cited by 16 (3 self)
 Add to MetaCart
(Show Context)
We consider the class of packing integer programs (PIPs) that are column sparse, i.e. there is a specified upper bound k on the number of constraints that each variable appears in. We give an (ek + o(k))-approximation algorithm for k-column sparse PIPs, improving on recent results of k² · 2^k [14] and O(k²) [3, 5]. We also show that the integrality gap of our linear programming relaxation is at least 2k − 1; it is known that k-column sparse PIPs are Ω(k/log k)-hard to approximate [8]. We also extend our result (at the loss of a small constant factor) to the more general case of maximizing a submodular objective over k-column sparse packing constraints.
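To make the object of study concrete: a k-column sparse PIP is just a packing IP in which each variable touches at most k constraints. The toy instance below (made-up data, k = 2) is solved by brute force only to illustrate the definition, not by the paper's algorithm.

```python
import itertools

# Toy 2-column-sparse packing integer program (made-up data):
#   maximize  sum_j w[j] * x[j],  x binary,
#   subject to A x <= b, where each column of A has at most 2 nonzeros.
w = [3, 2, 4, 1]
A = [[1, 0, 2, 0],    # constraint 0
     [1, 1, 0, 0],    # constraint 1
     [0, 1, 1, 1]]    # constraint 2
b = [2, 1, 2]

# Column sparsity: the max number of constraints any variable appears in.
k = max(sum(1 for row in A if row[j]) for j in range(len(w)))
assert k == 2

def feasible(x):
    return all(sum(a * xi for a, xi in zip(row, x)) <= bi
               for row, bi in zip(A, b))

# Brute force over all 0/1 vectors (fine for 4 variables).
best = max((x for x in itertools.product((0, 1), repeat=len(w))
            if feasible(x)),
           key=lambda x: sum(wi * xi for wi, xi in zip(w, x)))
```

On this instance the optimum takes variables 1 and 2 for value 6; taking variable 0 is blocked by the tight second constraint.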
Nonmonotone submodular maximization via a structural continuous greedy algorithm (Extended Abstract)
 In ICALP
, 2011
"... Consider a suboptimal solution S for a maximization problem. Suppose S’s value is small compared to an optimal solution OP T to the problem, yet S is structurally similar to OP T. A natural question in this setting is whether there is a way of improving S based solely on this information. In this p ..."
Abstract

Cited by 11 (2 self)
 Add to MetaCart
(Show Context)
Consider a suboptimal solution S for a maximization problem. Suppose S’s value is small compared to an optimal solution OPT to the problem, yet S is structurally similar to OPT. A natural question in this setting is whether there is a way of improving S based solely on this information. In this paper we introduce the Structural Continuous Greedy Algorithm, answering this question affirmatively in the setting of the Nonmonotone Submodular Maximization Problem. We improve on the best approximation factor known for this problem. In the Nonmonotone Submodular Maximization Problem we are given a nonnegative submodular function f, and the objective is to find a subset maximizing f. Our method yields a 0.42-approximation for this problem, improving on the current best approximation factor of 0.41 given by Gharan and Vondrák [5]. On the other hand, Feige et al. [4] showed a lower bound of 0.5 for this problem.
Fast algorithms for maximizing submodular functions
 In SODA
, 2014
"... There has been much progress recently on improved approximations for problems involving submodular objective functions, and many interesting techniques have been developed. However, the resulting algorithms are often slow and impractical. In this paper we develop algorithms that match the best know ..."
Abstract

Cited by 11 (2 self)
 Add to MetaCart
(Show Context)
There has been much progress recently on improved approximations for problems involving submodular objective functions, and many interesting techniques have been developed. However, the resulting algorithms are often slow and impractical. In this paper we develop algorithms that match the best known approximation guarantees, but with significantly improved running times, for maximizing a monotone submodular function f : 2^[n] → R+ subject to various constraints. As in previous work, we measure the number of oracle calls to the objective function, which is the dominating term in the running time. Our first result is a simple algorithm that gives a (1 − 1/e − ε)-approximation for a cardinality constraint using O((n/ε) log(n/ε)) queries, and a 1/(p + 2ℓ + 1 + ε)-approximation for the intersection of a p-system and ℓ knapsack (linear) constraints using O((n²/ε) log²(n/ε)) queries. This is the first approximation for a p-system combined with linear constraints. (We also show that the factor of p cannot be improved for maximizing over a p-system.) The main idea behind these algorithms serves as a building block in our more sophisticated algorithms. Our main result is a new variant of the continuous greedy algorithm, which interpolates between the classical greedy algorithm and a truly continuous algorithm. We show how this algorithm can be implemented for matroid and knapsack constraints using Õ(n²) oracle calls to the objective function. (Previous variants and alternative techniques were known to use at least Õ(n⁴) oracle calls.) This leads to an O((n²/ε⁴) log²(n/ε))-time (1 − 1/e − ε)-approximation for a matroid constraint. For a knapsack constraint, we develop a more involved (1 − 1/e − ε)-approximation algorithm that runs in time O(n²((1/ε) log n)^poly(1/ε)).
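The first result described above is a descending-thresholds greedy: instead of scanning for the best element each round, add any element whose marginal gain clears a threshold that decays geometrically. A minimal sketch under a cardinality constraint, following the abstract's description; the toy coverage objective and parameter choices are illustrative, not from the paper:

```python
# Sketch of a descending-thresholds greedy for a cardinality constraint.
def threshold_greedy(ground, f, k, eps=0.1):
    S = set()
    d = max(f({e}) for e in ground)   # max singleton value
    n = len(ground)
    w = d
    # Thresholds decay from d down to (eps/n) * d by (1 - eps) factors.
    while w >= eps * d / n and len(S) < k:
        for e in list(ground - S):
            if len(S) >= k:
                break
            if f(S | {e}) - f(S) >= w:   # gain clears current threshold
                S.add(e)
        w *= (1 - eps)
    return S

# Toy instance: maximize coverage with at most k = 2 sets (made-up data).
sets = {"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5, 6, 7}}
f = lambda S: len(set().union(*(sets[e] for e in S)) if S else set())
S = threshold_greedy(set(sets), f, k=2)
```

On this instance the threshold pass picks "C" immediately (gain 4) and "A" once the threshold decays below 3, covering all seven targets; the point of the scheme is that each element is examined only once per threshold level.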
Matroid matching: the power of local search
 IN STOC
, 2010
"... We consider the classical matroid matching problem. Unweighted matroid matching for linearlyrepresented matroids was solved by Lovász, and the problem is known to be intractable for general matroids. We present a PTAS for unweighted matroid matching for general matroids. In contrast, we show that ..."
Abstract

Cited by 10 (1 self)
 Add to MetaCart
(Show Context)
We consider the classical matroid matching problem. Unweighted matroid matching for linearly represented matroids was solved by Lovász, and the problem is known to be intractable for general matroids. We present a PTAS for unweighted matroid matching for general matroids. In contrast, we show that natural LP relaxations that have been studied have an Ω(n) integrality gap and, moreover, Ω(n) rounds of the Sherali–Adams hierarchy are necessary to bring the gap down to a constant. More generally, for any fixed k ≥ 2 and ε > 0, we obtain a (k/2 + ε)-approximation for matroid matching in k-uniform hypergraphs, also known as the matroid k-parity problem. As a consequence, we obtain a (k/2 + ε)-approximation for the problem of finding the maximum-cardinality set in the intersection of k matroids. We also give a 3/2-approximation for the weighted version of a special case of matroid matching, the matchoid problem.
Thresholded Covering Algorithms for Robust and MaxMin Optimization
, 2009
"... The general problem of robust optimization is this: one of several possible scenarios will appear tomorrow, but things are more expensive tomorrow than they are today. What should you anticipatorily buy today, so that the worstcase cost (summed over both days) is minimized? For example, in a set co ..."
Abstract

Cited by 9 (4 self)
 Add to MetaCart
(Show Context)
The general problem of robust optimization is this: one of several possible scenarios will appear tomorrow, but things are more expensive tomorrow than they are today. What should you anticipatorily buy today, so that the worst-case cost (summed over both days) is minimized? For example, in a set cover instance, if any one of the (n choose k) subsets of the universe that have size k may appear tomorrow, what is a good course of action? Feige et al. [FJMM07], and later, Khandekar et al. [KKMS08], considered this k-robust model where the possible outcomes tomorrow are given by all demand-subsets of size k, and gave algorithms for the set cover problem, and the Steiner tree and facility location problems in this model, respectively. In this paper, we give the following simple and intuitive template for k-robust problems: having built some anticipatory solution, if there exists a single demand whose augmentation cost is larger than some threshold (which is ≈ Opt/k), augment the anticipatory solution to cover this demand as well, and repeat. In this paper we show that this template gives us approximation algorithms for k-robust versions of Steiner tree and set
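The augmentation template described above can be sketched for k-robust set cover as follows. The instance, the guessed threshold T ≈ Opt/k, and the use of the cheapest single covering set as the "augmentation cost" are all simplifying assumptions for illustration, not the paper's full algorithm:

```python
# Toy set-cover instance (made-up): name -> (elements covered, cost).
sets = {"S1": ({1, 2}, 1.0), "S2": ({3}, 5.0),
        "S3": ({4}, 0.5), "S4": ({3, 4}, 5.2)}
universe = {1, 2, 3, 4}

def cheapest_cover(e, bought):
    """Day-2 augmentation cost for element e given the first-stage sets
    (simplified here to the cheapest single set covering e)."""
    covered = set().union(*(sets[s][0] for s in bought)) if bought else set()
    if e in covered:
        return 0.0
    return min(c for s, (elems, c) in sets.items() if e in elems)

def thresholded(T):
    """While some single demand costs more than T to cover tomorrow,
    buy a set covering it today; otherwise stop."""
    bought = set()
    while True:
        expensive = [e for e in universe if cheapest_cover(e, bought) > T]
        if not expensive:
            return bought
        e = expensive[0]
        best = min((s for s in sets if e in sets[s][0]),
                   key=lambda s: sets[s][1])
        bought.add(best)

first_stage = thresholded(T=2.0)
```

With T = 2.0, only element 3 is expensive to defer (cheapest cover costs 5.0), so the anticipatory solution buys "S2" and leaves every cheap element to be covered on demand tomorrow.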