Results 1–10 of 160
Proof verification and hardness of approximation problems
In Proc. 33rd Ann. IEEE Symp. on Foundations of Computer Science, 1992
Abstract

Cited by 797 (39 self)
We show that every language in NP has a probabilistic verifier that checks membership proofs for it using a logarithmic number of random bits and by examining a constant number of bits in the proof. If a string is in the language, then there exists a proof such that the verifier accepts with probability 1 (i.e., for every choice of its random string). For strings not in the language, the verifier rejects every provided “proof” with probability at least 1/2. Our result builds upon and improves a recent result of Arora and Safra [6], whose verifiers examine a nonconstant number of bits in the proof (though this number is a very slowly growing function of the input length). As a consequence we prove that no MAX SNP-hard problem has a polynomial-time approximation scheme, unless NP = P. The class MAX SNP was defined by Papadimitriou and Yannakakis [82], and hard problems for this class include vertex cover, maximum satisfiability, maximum cut, metric TSP, Steiner trees and shortest superstring. We also improve upon the clique hardness results of Feige, Goldwasser, Lovász, Safra and Szegedy [42] and Arora and Safra [6], and show that there exists a positive ε such that approximating the maximum clique size in an N-vertex graph to within a factor of N^ε is NP-hard.
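In later notation (added here for orientation, not wording from the abstract), the verifier described above is exactly the statement of the PCP theorem:

```latex
% NP equals the class of languages with probabilistically checkable
% proofs using O(log n) random bits and O(1) proof queries, with
% completeness 1 and soundness error at most 1/2.
\[
  \mathrm{NP} \;=\; \mathrm{PCP}_{1,\,1/2}\!\left[\, O(\log n),\; O(1) \,\right]
\]
```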
A Parallel Repetition Theorem
SIAM Journal on Computing, 1998
Abstract

Cited by 362 (9 self)
We show that a parallel repetition of any two-prover one-round proof system (MIP(2, 1)) decreases the probability of error at an exponential rate. No constructive bound was previously known. The constant in the exponent (in our analysis) depends only on the original probability of error and on the total number of possible answers of the two provers. The dependency on the total number of possible answers is logarithmic, which was recently proved to be almost the best possible [U. Feige and O. Verbitsky, Proc. 11th Annual IEEE Conference on Computational Complexity, IEEE Computer Society Press, Los Alamitos, CA, 1996, pp. 70–76].
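One standard way to write the bound (notation added here, not taken from the abstract): if a two-prover one-round game $G$ has error probability $\delta > 0$ and the provers have answer alphabets of total size $s$, then the $n$-fold parallel repetition satisfies

```latex
% Exponential decay of the error of the repeated game; the base
% depends only on the original error delta, and the exponent's
% dependence on the answer-alphabet size s is logarithmic.
\[
  \mathrm{val}\!\left(G^{\otimes n}\right)
  \;\le\;
  \left(1 - \delta^{c}\right)^{\,n/\log s}
\]
```

for some absolute constant $c$; the logarithmic dependence on $s$ in the exponent is the dependency the abstract describes as almost best possible.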
Truth revelation in approximately efficient combinatorial auctions
Journal of the ACM, 2002
Abstract

Cited by 230 (1 self)
Some important classical mechanisms considered in microeconomics and game theory require the solution of a difficult optimization problem. This is true of mechanisms for combinatorial auctions, which have in recent years assumed practical importance, and in particular of the gold standard for combinatorial auctions, the Generalized Vickrey Auction (GVA). Traditional analysis of these mechanisms, in particular of their truth-revelation properties, assumes that the optimization problems are solved precisely. In reality, these optimization problems can usually be solved only approximately. We investigate the impact on such mechanisms of replacing exact solutions by approximate ones. Specifically, we look at a particular greedy optimization method. We show that the GVA payment scheme does not provide a truth-revealing mechanism. We introduce another scheme that does guarantee truthfulness for a restricted class of players. We demonstrate the latter property by identifying natural properties for combinatorial auctions and showing that, for our restricted class of players, they imply that truthful strategies are dominant. Those properties have applicability beyond the specific auction studied.
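The flavor of such a greedy method can be sketched as follows. This is a minimal illustration with hypothetical names, not the paper's exact algorithm; the ranking norm price/√|bundle| is one rule studied in this literature and is an assumption of the sketch:

```python
from math import sqrt

def greedy_allocation(bids):
    """Greedily accept non-conflicting bundle bids.

    bids: list of (bidder, bundle, price), with bundle a frozenset of items.
    Ranks bids by price / sqrt(|bundle|) -- one norm studied for greedy
    combinatorial-auction allocation -- and accepts each bid whose bundle
    is disjoint from everything accepted so far.
    """
    ranked = sorted(bids, key=lambda b: b[2] / sqrt(len(b[1])), reverse=True)
    taken = set()    # items already allocated
    winners = []
    for bidder, bundle, price in ranked:
        if taken.isdisjoint(bundle):
            winners.append((bidder, bundle, price))
            taken |= bundle
    return winners
```

Because the allocation is only approximately optimal, the pricing scheme layered on top of it, rather than the allocation rule alone, determines whether truthful bidding remains a dominant strategy.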
Combinatorial auctions: A survey
2000
Abstract

Cited by 215 (1 self)
Many auctions involve the sale of a variety of distinct assets. Examples are airport time slots, delivery routes and furniture. Because of complementarities (or substitution effects) between the different assets, bidders have preferences not just for particular items but for sets or bundles of items. For this reason, economic efficiency is enhanced if bidders are allowed to bid on bundles or combinations of different assets. This paper first surveys the state of knowledge about the design of combinatorial auctions. Second, it uses this subject as a vehicle to convey the aspects of integer programming that are relevant for the
Winner determination in combinatorial auction generalizations
2002
Abstract

Cited by 175 (23 self)
Combinatorial markets where bids can be submitted on bundles of items can be economically desirable coordination mechanisms in multiagent systems where the items exhibit complementarity and substitutability. There has been a surge of recent research on winner determination in combinatorial auctions. In this paper we study a wider range of combinatorial market designs: auctions, reverse auctions, and exchanges, with one or multiple units of each item, with and without free disposal. We first theoretically characterize the complexity. The most interesting results are that reverse auctions with free disposal can be approximated, and that in all of the cases without free disposal, even finding a feasible solution is NP-complete. We then ran experiments on known benchmarks as well as ones that we introduced, to study the complexity of the market variants in practice. Cases with free disposal tended to be easier than ones without. On many distributions, reverse auctions with free disposal were easier than auctions with free disposal, as the approximability would suggest, but interestingly, on one of the most realistic distributions they were harder. Single-unit exchanges were easy, but multi-unit exchanges were extremely hard.
eMediator: A Next Generation Electronic Commerce Server
Computational Intelligence, 2002
Abstract

Cited by 123 (32 self)
This paper presents eMediator, an electronic commerce server prototype that demonstrates ways in which algorithmic support and game-theoretic incentive engineering can jointly improve the efficiency of e-commerce. eAuctionHouse, the configurable auction server, includes a variety of generalized combinatorial auctions and exchanges, pricing schemes, bidding languages, mobile agents, and user support for choosing an auction type. We introduce two new logical bidding languages for combinatorial markets: the XOR bidding language and the OR-of-XORs bidding language. Unlike the traditional OR bidding language, these are fully expressive. They therefore enable the use of the Clarke-Groves pricing mechanism for motivating the bidders to bid truthfully. eAuctionHouse also supports supply/demand curve bidding. eCommitter, the leveled commitment contract optimizer, determines the optimal contract price and decommitting penalties for a variety of leveled commitment contracting mechanisms, taking into account that rational agents will decommit strategically in Nash equilibrium. It also determines the optimal decommitting strategies for any given leveled commitment contract. eExchangeHouse, the safe exchange planner, enables unenforced anonymous exchanges by dividing the exchange into chunks and sequencing those chunks to be delivered safely in alternation between the buyer and the seller.
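How an XOR bid assigns value can be sketched minimally (function name and representation are assumptions for illustration): an XOR bid lists atomic (bundle, price) pairs of which at most one may be satisfied, so a set of items won is worth the best price among the bundles it fully covers.

```python
def xor_value(xor_bid, items_won):
    """Value of an XOR bid for a given set of items won.

    xor_bid: iterable of (bundle, price) atomic bids, bundle a frozenset.
    At most one atomic bid may be satisfied, so the value is the maximum
    price over bundles fully contained in items_won (0 if none is).
    """
    return max((price for bundle, price in xor_bid if bundle <= items_won),
               default=0)
```

Under the OR-of-XORs language, several such XOR clauses can be satisfied jointly but only on disjoint sets of items, which keeps the language fully expressive while often allowing a more compact bid than a single large XOR.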
The Importance of Being Biased
2002
Abstract

Cited by 88 (7 self)
The Minimum Vertex Cover problem is the problem of, given a graph, finding a smallest set of vertices that touches all edges. We show that it is NP-hard to approximate this problem to within a factor of 1.36067, improving on the previously known hardness factor of 7/6.
Improved Inapproximability Results for MaxClique, Chromatic Number and Approximate Graph Coloring
Abstract

Cited by 72 (8 self)
In this paper, we present improved inapproximability results for three problems: the problem of finding the maximum clique size in a graph, the problem of finding the chromatic number of a graph, and the problem of coloring a graph with a small chromatic number with a small number of colors. Håstad's celebrated result [13] shows that the maximum clique size in a graph with n vertices is inapproximable in polynomial time within a factor n^(1−ε) for an arbitrarily small constant ε > 0 unless NP = ZPP. In this paper, we aim at getting the best subconstant value of ε in Håstad's result. We prove that clique size is inapproximable within a factor n/2^((log n)^(1−γ)) (corresponding to ε = 1/(log n)^γ) for some constant γ > 0 unless NP ⊆ ZPTIME(2^((log n)^O(1))). This improves the previous best inapproximability factor of
Approximations of Weighted Independent Set and Hereditary Subset Problems
Journal of Graph Algorithms and Applications, 2000
Abstract

Cited by 71 (6 self)
The focus of this study is to clarify the approximability of weighted versions of the maximum independent set problem. In particular, we report improved performance ratios in bounded-degree graphs, inductive graphs, and general graphs, as well as for the unweighted problem in sparse graphs. Where possible, the techniques are applied to related hereditary subgraph and subset problems, obtaining ratios better than previously reported for, e.g., Weighted Set Packing, Longest Common Subsequence, and Independent Set in hypergraphs.