Results 1–10 of 87
Vertex Cover Might be Hard to Approximate to within 2 − ε
Cited by 151 (11 self)
Based on a conjecture regarding the power of unique 2-prover-1-round games presented in [Khot02], we show that vertex cover is hard to approximate within any constant factor better than 2. We actually show a stronger result: based on the same conjecture, vertex cover on k-uniform hypergraphs is hard to approximate within any constant factor better than k.
Conditional hardness for approximate coloring
In STOC, 2006
Cited by 46 (14 self)
We study the APPROXIMATE-COLORING(q, Q) problem: Given a graph G, decide whether χ(G) ≤ q or χ(G) ≥ Q (where χ(G) is the chromatic number of G). We derive conditional hardness for this problem for any constant 3 ≤ q < Q. For q ≥ 4, our result is based on Khot's 2-to-1 conjecture [Khot'02]. For q = 3, we base our hardness result on a certain '⋈-shaped' variant of his conjecture. We also prove that the problem ALMOST-3-COLORING_ε is hard for any constant ε > 0, assuming Khot's Unique Games conjecture. This is the problem of deciding, for a given graph, between the case where one can 3-color all but an ε fraction of the vertices without monochromatic edges, and the case where the graph contains no independent set of relative size at least ε. Our result is based on bounding various generalized noise-stability quantities using the invariance principle of Mossel et al. [MOO'05].
On Approximating the Minimum Vertex Cover in Sublinear Time and the Connection to Distributed Algorithms
Electronic Colloquium on Computational Complexity, Report No. 94, 2005
Cited by 40 (6 self)
We consider the problem of estimating the size, VC(G), of a minimum vertex cover of a graph G, in sublinear time, by querying the incidence relation of the graph. We say that an algorithm is an (α, ε)-approximation algorithm if it outputs with high probability an estimate V̂C such that VC(G) − εn ≤ V̂C ≤ α · VC(G) + εn, where n is the number of vertices of G. We show that the query complexity of such algorithms must grow at least linearly with the average degree d̄ of the graph. In particular, this means that for dense graphs it is not possible to design an algorithm whose complexity is o(n). On the positive side, we first describe a simple (O(log(d̄/ε)), ε)-approximation algorithm whose query complexity is ε^(−2) · (d̄/ε)^(log(d̄/ε)+O(1)). We then show a reduction from local distributed approximation algorithms to sublinear approximation algorithms. Using this reduction and the distributed algorithm of Kuhn, Moscibroda, and Wattenhofer [KMW05], we obtain an (O(1), ε)-approximation algorithm whose query complexity is ε^(−2) · (d̄/ε)^(O(log(d̄/ε))).
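The (α, ε)-approximation guarantee above can be illustrated on a tiny graph: the size M of any maximal matching satisfies M ≤ VC(G) ≤ 2M, so reporting 2M is already a (2, 0)-approximation; the paper's sublinear algorithms estimate such local quantities by sampling instead of reading the whole graph. A minimal sketch, where the brute-force solver and greedy matching are illustrative stand-ins and not the paper's algorithm:

```python
from itertools import combinations

def min_vertex_cover_size(n, edges):
    """Brute-force VC(G) on a small graph (exponential; illustration only)."""
    for k in range(n + 1):
        for cover in combinations(range(n), k):
            s = set(cover)
            if all(u in s or v in s for u, v in edges):
                return k

def maximal_matching_size(edges):
    """Greedy maximal matching; its size M satisfies M <= VC(G) <= 2M."""
    matched, m = set(), 0
    for u, v in edges:
        if u not in matched and v not in matched:
            matched |= {u, v}
            m += 1
    return m

# Hypothetical example graph: a 5-cycle plus one chord.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (1, 3)]
vc = min_vertex_cover_size(5, edges)
m = maximal_matching_size(edges)
assert m <= vc <= 2 * m   # the (2, 0)-approximation sandwich
```

Each matching edge forces at least one cover vertex (so M ≤ VC), while taking both endpoints of every matching edge covers all edges by maximality (so VC ≤ 2M).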
Approximability of Combinatorial Problems with Multiagent Submodular Cost Functions
Cited by 32 (6 self)
Applications in complex systems such as the Internet have spawned recent interest in studying situations involving multiple agents with their individual cost or utility functions. In this paper, we introduce an algorithmic framework for studying combinatorial problems in the presence of multiple agents with submodular cost functions. We study several fundamental covering problems (Vertex Cover, Shortest Path, Perfect Matching, and Spanning Tree) in this setting and establish tight upper and lower bounds for the approximability of these problems.
Inapproximability of Vertex Cover and Independent Set in Bounded Degree Graphs
Cited by 21 (0 self)
We study the inapproximability of Vertex Cover and Independent Set on degree-d graphs. We prove that:
• Vertex Cover is Unique Games-hard to approximate to within a factor 2 − (2 + o_d(1)) · (log log d / log d). This exactly matches the algorithmic result of Halperin [1] up to the o_d(1) term.
• Independent Set is Unique Games-hard to approximate to within a factor O(d / log^2 d). This improves the d / log^O(1) d Unique Games hardness result of Samorod…
Approximation algorithms and hardness results for labeled connectivity problems
In 31st MFCS, 2006
Cited by 20 (5 self)
Let G = (V, E) be a connected multigraph whose edges are associated with labels specified by an integer-valued function L: E → N. In addition, each label ℓ ∈ N to which at least one edge is mapped has a non-negative cost c(ℓ). The minimum label spanning tree problem (MinLST) asks for a spanning tree in G that minimizes the overall cost of the labels used by its edges. Equivalently, we aim at finding a minimum-cost subset of labels I ⊆ N such that the edge set {e ∈ E : L(e) ∈ I} forms a connected subgraph spanning all vertices. Similarly, in the minimum label s–t path problem (MinLP) the goal is to identify an s–t path minimizing the combined cost of its labels, where s and t are provided as part of the input. The main contributions of this paper are improved approximation algorithms and hardness results for MinLST and MinLP. As a secondary objective, we make a concentrated effort to relate the algorithmic methods utilized in approximating these problems to a number of well-known techniques originally studied in the context of integer covering.
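The equivalent label-subset formulation of MinLST suggests a natural greedy heuristic in the spirit of Krumke and Wirth's classic algorithm: repeatedly add the unchosen label that merges the most connected components per unit cost. This is a sketch of that folklore greedy (function names and the example instance are illustrative), not the improved algorithms of this paper:

```python
class DSU:
    """Union-find over vertices 0..n-1, with path halving."""
    def __init__(self, n):
        self.p = list(range(n))
    def find(self, x):
        while self.p[x] != x:
            self.p[x] = self.p[self.p[x]]
            x = self.p[x]
        return x
    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.p[ra] = rb
            return True
        return False

def greedy_min_label_spanning(n, labeled_edges, cost):
    """Pick labels greedily until {e : L(e) chosen} spans all n vertices.
    labeled_edges: list of (u, v, label); cost: dict label -> positive cost."""
    chosen, dsu, comps = set(), DSU(n), n
    labels = {lab for _, _, lab in labeled_edges}
    while comps > 1:
        best, best_score = None, 0.0
        for lab in labels - chosen:
            trial = DSU(n)
            trial.p = dsu.p[:]          # simulate adding this label's edges
            merges = sum(trial.union(u, v)
                         for u, v, m in labeled_edges if m == lab)
            score = merges / cost[lab]  # components merged per unit cost
            if score > best_score:
                best, best_score = lab, score
        if best is None:
            return None                 # graph cannot be connected
        chosen.add(best)
        for u, v, m in labeled_edges:
            if m == best and dsu.union(u, v):
                comps -= 1
    return chosen

# Illustrative instance: two cheap labels suffice to connect 4 vertices.
picked = greedy_min_label_spanning(
    4, [(0, 1, 'a'), (1, 2, 'a'), (2, 3, 'b'), (0, 3, 'c')],
    {'a': 1, 'b': 1, 'c': 5})
assert picked == {'a', 'b'}
```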
On short paths interdiction problems: total and node-wise limited interdiction
2007
New lower bounds for Vertex Cover in the Lovász–Schrijver hierarchy
2006
Cited by 16 (1 self)
Lovász and Schrijver [13] defined three progressively stronger procedures, LS_0, LS and LS_+, for systematically tightening linear relaxations over many rounds. All three procedures yield the integral hull after at most n rounds. On the other hand, constant rounds of LS_+ can derive the relaxations behind many famous approximation algorithms, such as those for MAX-CUT and SPARSEST-CUT, so proving round lower bounds for these procedures on specific problems may give evidence about inapproximability. We prove new round lower bounds for VERTEX COVER in the LS hierarchy. Arora et al. [3] showed that the integrality gap for VERTEX COVER relaxations remains 2 − o(1) even after Ω(log n) rounds of LS. However, their method can only prove round lower bounds as large as the girth of the input graph, which is O(log n) for interesting graphs. We break through this "girth barrier" and show that the integrality gap for VERTEX COVER remains 1.5 − ε even after Ω(log^2 n) rounds of LS. In contrast, the best PCP-based results only rule out 1.36-approximations. Moreover, we conjecture that the new technique we introduce to prove our lower bound, the "fence" method, may lead to linear or nearly linear LS round lower bounds for VERTEX COVER.
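For concreteness, the relaxation being tightened here starts from the standard vertex cover LP; on the complete graph K_n, the all-halves point shows why the basic (round-zero) integrality gap already approaches 2. This is a standard textbook observation, not a result of the paper:

```latex
\min \sum_{v \in V} x_v
\quad \text{s.t.} \quad
x_u + x_v \ge 1 \;\; \forall \{u,v\} \in E,
\qquad 0 \le x_v \le 1.
% On K_n, the point x_v \equiv 1/2 is feasible with value n/2,
% while every vertex cover of K_n has n - 1 vertices,
% so the integrality gap is (n-1)/(n/2) \to 2.
```

The LS and LS_+ procedures add further valid inequalities to this polytope round by round; the result above says the gap survives Ω(log^2 n) such rounds.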
Private Approximation of Search Problems
Electronic Colloquium on Computational Complexity, 2005
Cited by 15 (2 self)
Many approximation algorithms have been presented in recent decades for hard search problems. The focus of this paper is on cryptographic applications, where it is desired to design algorithms that do not leak unnecessary information. Specifically, we are interested in private approximation algorithms – efficient algorithms whose output does not leak information not implied by the optimal solutions to the search problems. Privacy requirements add constraints on the approximation algorithms; in particular, known approximation algorithms usually leak a lot of information. For functions, [Feigenbaum et al., ICALP 2001] presented a natural requirement that a private algorithm should not leak information not implied by the original function. Generalizing this requirement to search problems is not straightforward, as an input may have many different outputs. We present a new definition that captures a minimal privacy requirement from such algorithms: applied to an input instance, an algorithm should not leak any information that is not implied by the instance's collection of exact solutions. Although our privacy requirement seems minimal, we show that for well-studied problems, such as vertex cover and maximum exact 3-SAT, private approximation algorithms are unlikely to exist even for poor approximation ratios. Similar to [Halevi et al., STOC 2001], we define a relaxed notion of approximation algorithms that leak (little) information, and demonstrate the applicability of this notion by showing near-optimal approximation algorithms for maximum exact 3-SAT which leak little information.
Set systems without a simplex or a cluster
2007
Cited by 15 (6 self)
A d-dimensional simplex is a collection of d+1 sets with empty intersection, every d of which have nonempty intersection. A k-uniform d-cluster is a collection of d+1 sets of size k with empty intersection and union of size at most 2k. We prove the following result, which simultaneously addresses an old conjecture of Chvátal [7] and a recent conjecture of the second author [28]. For d ≥ 2 and ζ > 0 there is a number T such that the following holds for sufficiently large n. Let G be a k-uniform set system on [n] = {1, …, n} with ζn < k < n/2 − T, and suppose either that G contains no d-dimensional simplex or that G contains no d-cluster. Then |G| ≤ (n−1 choose k−1), with equality only for the family of all k-sets containing a specific element. In the non-uniform setting we obtain the following exact result, which generalises an old question of Erdős and a result of Milner [11], who proved the case d = 2. Suppose d ≥ 2 and G is a set system on [n] that does not contain a d-dimensional simplex, with n sufficiently large. Then |G| ≤ 2^(n−1) + Σ_{i=0}^{d−1} (n−1 choose i), with equality only for the family consisting of all sets that either contain some specific element or have size at most d − 1. Each of these results is proved via the corresponding stability result, which gives structural information on any G whose size is close to maximum. These in turn rely on a stability result that we obtain for intersecting families, which supersedes a result of Friedgut [17] that was proved using spectral techniques, and is based on a purely combinatorial result of Frankl.
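The extremal family in the theorem (a "star": all k-sets through one fixed element) can be sanity-checked by brute force for tiny parameters; the choice n = 6, k = 3, d = 2 below is arbitrary and purely illustrative:

```python
from itertools import combinations
from math import comb

n, k, d = 6, 3, 2

# The extremal family: all k-subsets of [n] containing the fixed element 1.
star = [frozenset(s) | {1} for s in combinations(range(2, n + 1), k - 1)]
assert len(star) == comb(n - 1, k - 1)   # matches the (n-1 choose k-1) bound

# A d-dimensional simplex requires d+1 sets with EMPTY overall intersection.
# Every member of the star contains 1, so no (d+1)-tuple of its sets can have
# empty intersection, and likewise no d-cluster can occur.
for sets in combinations(star, d + 1):
    assert 1 in frozenset.intersection(*sets)   # never empty
```

The same one-line argument (a common element blocks any empty intersection) is why stars are simplex-free and cluster-free in general; the hard direction of the theorem is that nothing larger exists.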