Results 11–20 of 133
Approximating Submodular Functions Everywhere
"... Submodular functions are a key concept in combinatorial optimization. Algorithms that involve submodular functions usually assume that they are given by a (value) oracle. Many interesting problems involving submodular functions can be solved using only polynomially many queries to the oracle, e.g., ..."
Abstract

Cited by 45 (4 self)
Submodular functions are a key concept in combinatorial optimization. Algorithms that involve submodular functions usually assume that they are given by a (value) oracle. Many interesting problems involving submodular functions can be solved using only polynomially many queries to the oracle, e.g., exact minimization or approximate maximization. In this paper, we consider the problem of approximating a nonnegative, monotone, submodular function f on a ground set of size n everywhere, after only poly(n) oracle queries. Our main result is a deterministic algorithm that makes poly(n) oracle queries and derives a function f̂ such that, for every set S, f̂(S) approximates f(S) within a factor α(n), where α(n) = √(n+1) for rank functions of matroids and α(n) = O(√n · log n) for general monotone submodular functions. Our result is based on approximately finding a maximum-volume inscribed ellipsoid in a symmetrized polymatroid, and the analysis involves various properties of submodular functions and polymatroids. Our algorithm is tight up to logarithmic factors. Indeed, we show that no algorithm can achieve a factor better than Ω(√n / log n), even for rank functions of a matroid.
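The value-oracle model in this abstract can be illustrated with a toy coverage function (a hypothetical example, not the paper's algorithm): f is queried only through function calls, and a brute-force check on a small ground set confirms the diminishing-returns property that defines submodularity.

```python
from itertools import combinations

# Hypothetical coverage function: each ground-set element covers a subset
# of a universe, and f(S) is the size of the union covered by S.
cover = {0: {1, 2}, 1: {2, 3}, 2: {3, 4, 5}, 3: {1, 5}}

def f(S):
    """Value oracle: size of the union covered by S."""
    covered = set()
    for e in S:
        covered |= cover[e]
    return len(covered)

def is_submodular(ground, oracle):
    """Brute-force diminishing-returns check:
    f(S + x) - f(S) >= f(T + x) - f(T) for all S ⊆ T and x ∉ T."""
    ground = list(ground)
    subsets = [frozenset(c) for r in range(len(ground) + 1)
               for c in combinations(ground, r)]
    for S in subsets:
        for T in subsets:
            if S <= T:
                for x in ground:
                    if x not in T:
                        if oracle(S | {x}) - oracle(S) < oracle(T | {x}) - oracle(T):
                            return False
    return True

print(is_submodular(cover.keys(), f))  # prints True: coverage is submodular
```

The exhaustive check takes exponential time, which is exactly why the paper's poly(n)-query guarantee is nontrivial.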
Mechanisms for Multi-Unit Auctions
In Proceedings of the ACM Conference on Electronic Commerce (EC), 2007
"... We present an incentivecompatible polynomialtime approximation scheme for multiunit auctions with general kminded player valuations. The mechanism fully optimizes over an appropriately chosen subrange of possible allocations and then uses VCG payments over this subrange. We show that obtaining ..."
Abstract

Cited by 37 (3 self)
We present an incentive-compatible polynomial-time approximation scheme for multi-unit auctions with general k-minded player valuations. The mechanism fully optimizes over an appropriately chosen subrange of possible allocations and then uses VCG payments over this subrange. We show that obtaining a fully polynomial-time incentive-compatible approximation scheme, at least using VCG payments, is NP-hard. For the case of valuations given by black boxes, we give a polynomial-time incentive-compatible 2-approximation mechanism and show that no better is possible, at least using VCG payments.
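As a hedged sketch of the VCG idea this abstract relies on (not the paper's polynomial-time mechanism, which optimizes only over a restricted subrange), the brute-force toy below allocates m identical units among k-minded bidders and charges each winner the externality she imposes on the others.

```python
from itertools import product

def val(bids, q):
    """Value of a k-minded bidder for q units: best (quantity, value) pair with quantity <= q."""
    return max((v for need, v in bids if need <= q), default=0)

def best_welfare(bidders, m, exclude=None):
    """Exhaustively maximize total value over allocations of at most m units,
    optionally ignoring one bidder's value (for the VCG payment)."""
    best, best_alloc = -1, None
    for alloc in product(range(m + 1), repeat=len(bidders)):
        if sum(alloc) > m:
            continue
        w = sum(val(b, q) for i, (b, q) in enumerate(zip(bidders, alloc)) if i != exclude)
        if w > best:
            best, best_alloc = w, alloc
    return best, best_alloc

def vcg(bidders, m):
    """VCG: welfare-optimal allocation, each payment equal to the bidder's externality."""
    welfare, alloc = best_welfare(bidders, m)
    payments = []
    for i in range(len(bidders)):
        without_i, _ = best_welfare(bidders, m, exclude=i)
        others_with_i = welfare - val(bidders[i], alloc[i])
        payments.append(without_i - others_with_i)
    return alloc, payments
```

For example, with three single-minded bidders valuing 1, 2, and 3 units at 5, 7, and 8 and m = 3 units, the first two bidders win and pay 1 and 3 respectively.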
From convex optimization to randomized mechanisms: Toward optimal combinatorial auctions
In Proceedings of the 43rd Annual ACM Symposium on Theory of Computing (STOC), 2011
"... We design an expected polynomialtime, truthfulinexpectation, (1 − 1/e)approximation mechanism for welfare maximization in a fundamental class of combinatorial auctions. Our results apply to bidders with valuations that are matroid rank sums (MRS), which encompass mostconcreteexamplesofsubmodular ..."
Abstract

Cited by 34 (11 self)
We design an expected polynomial-time, truthful-in-expectation, (1 − 1/e)-approximation mechanism for welfare maximization in a fundamental class of combinatorial auctions. Our results apply to bidders with valuations that are matroid rank sums (MRS), which encompass most concrete examples of submodular functions studied in this context, including coverage functions, matroid weighted-rank functions, and convex combinations thereof. Our approximation factor is the best possible, even for known and explicitly given coverage valuations, assuming P ≠ NP. Ours is the first truthful-in-expectation and polynomial-time mechanism to achieve a constant-factor approximation for an NP-hard welfare maximization problem in combinatorial auctions with heterogeneous goods and restricted valuations. Our mechanism is an instantiation of a new framework for designing approximation mechanisms based on randomized rounding algorithms. A typical such algorithm first optimizes over a fractional relaxation of the original problem, and then randomly rounds the fractional solution to an integral one. With rare exceptions, such algorithms cannot be converted into truthful mechanisms. The high-level idea of our mechanism design framework is to optimize directly ...
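The relax-and-round pattern described in the last sentences can be sketched generically (a hypothetical illustration, not the paper's framework): given a fractional solution in which item j is assigned to bidder i with fraction x[i][j], each item is rounded independently according to its own column distribution.

```python
import random

def round_assignment(x, rng=random.random):
    """Independently round a fractional assignment: item j goes to bidder i
    with probability x[i][j]; if a column sums to less than 1, the item may
    stay unassigned (None)."""
    n_bidders, n_items = len(x), len(x[0])
    assignment = [None] * n_items
    for j in range(n_items):
        r, cum = rng(), 0.0
        for i in range(n_bidders):
            cum += x[i][j]
            if r < cum:
                assignment[j] = i
                break
    return assignment

# e.g. a fractional optimum over two bidders and two items
fractional = [[0.7, 0.2],
              [0.3, 0.8]]
```

The expected value of the rounded solution matches the fractional one item by item; the abstract's point is that such rounding, applied naively, usually destroys truthfulness.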
Setting lower bounds on truthfulness
In Proceedings of the Eighteenth Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), 2007
"... We present and discuss general techniques for proving inapproximability results for truthful mechanisms. We make use of these techniques to prove lower bounds on the approximability of several nonutilitarian multiparameter problems. In particular, we demonstrate the strength of our techniques by e ..."
Abstract

Cited by 29 (4 self)
We present and discuss general techniques for proving inapproximability results for truthful mechanisms. We make use of these techniques to prove lower bounds on the approximability of several non-utilitarian multi-parameter problems. In particular, we demonstrate the strength of our techniques by exhibiting a lower bound of 2 − 1/m for the scheduling problem with unrelated machines (formulated as a mechanism design problem in the seminal paper of Nisan and Ronen on Algorithmic Mechanism Design). Our lower bound applies to truthful randomized mechanisms (disregarding any computational assumptions on the running time of these mechanisms). Moreover, it holds even for the weaker notion of truthfulness for randomized mechanisms, i.e., truthfulness in expectation. This lower bound nearly matches the known 7/4 (randomized) truthful upper bound for the case of two machines (a non-truthful FPTAS exists). No lower bound for truthful randomized mechanisms in multi-parameter settings was previously known. We show an application of our techniques to the workload-minimization problem in networks. We prove our lower bounds for this problem in the inter-domain routing setting presented by Feigenbaum, Papadimitriou, Sami, and Shenker. Finally, we discuss several notions of non-utilitarian “fairness” (Max-Min fairness, Min-Max fairness, and envy minimization). We show how our techniques can be used to prove lower bounds for these notions.
Single-value combinatorial auctions and algorithmic implementation in undominated strategies
In ACM Symposium on Discrete Algorithms, 2011
"... In this paper we are interested in general techniques for designing mechanisms that approximate the social welfare in the presence of selfish rational behavior. We demonstrate our results in the setting of Combinatorial Auctions (CA). Our first result is a general deterministic technique to decouple ..."
Abstract

Cited by 26 (2 self)
In this paper we are interested in general techniques for designing mechanisms that approximate the social welfare in the presence of selfish rational behavior. We demonstrate our results in the setting of Combinatorial Auctions (CA). Our first result is a general deterministic technique to decouple the algorithmic allocation problem from the strategic aspects, by a procedure that converts any algorithm to a dominant-strategy ascending mechanism. This technique works for any single-value domain, in which each agent has the same value for each desired outcome, and this value is the only private information. In particular, for “single-value CAs”, where each player desires any one of several different bundles but has the same value for each of them, our technique converts any approximation algorithm to a dominant-strategy mechanism that almost preserves the original approximation ratio. Our second result provides the first computationally efficient deterministic mechanism for the case of single-value multi-minded bidders (with private value and private desired bundles). The mechanism achieves an approximation to the social welfare which is close to the best possible in polynomial time (unless P = NP). This mechanism is an algorithmic implementation in undominated strategies, a notion that we define and justify, and is of independent interest.
Limitations of VCG-based mechanisms
In Proceedings of the 39th Annual ACM Symposium on Theory of Computing, 2007
"... We consider computationallyefficient incentivecompatible mechanisms that use the VCG payment scheme, and study how well they can approximate the social welfare in auction settings. We present a novel technique for setting lower bounds on the approximation ratio of this type of mechanisms. Specific ..."
Abstract

Cited by 24 (2 self)
We consider computationally efficient incentive-compatible mechanisms that use the VCG payment scheme, and study how well they can approximate the social welfare in auction settings. We present a novel technique for setting lower bounds on the approximation ratio of this type of mechanism. Specifically, for combinatorial auctions among submodular (and thus also subadditive) bidders we prove an Ω(m^(1/6)) lower bound, which is close to the known upper bound of O(m^(1/2)), and qualitatively higher than the constant-factor approximation possible from a purely computational point of view.
Sketching Valuation Functions
, 2011
"... Motivated by the problem of querying and communicating bidders ’ valuations in combinatorial auctions, we study how well different classes of set functions can be sketched. More formally, let f be a function mapping subsets of some ground set [n] to the nonnegative real numbers. We say that f ′ is ..."
Abstract

Cited by 23 (2 self)
Motivated by the problem of querying and communicating bidders’ valuations in combinatorial auctions, we study how well different classes of set functions can be sketched. More formally, let f be a function mapping subsets of some ground set [n] to the nonnegative real numbers. We say that f′ is an α-sketch of f if for every set S, the value f′(S) lies between f(S)/α and f(S), and f′ can be specified by poly(n) bits. We show that for every subadditive function f there exists an α-sketch where α = n^(1/2) · O(polylog(n)). Furthermore, we provide an algorithm that finds these sketches with a polynomial number of demand queries. This is essentially the best we can hope for since: (1) we show that there exist subadditive functions (in fact, XOS functions) that do not admit an o(n^(1/2)) sketch (Balcan and Harvey [3] previously showed that there exist functions belonging to the class of substitutes valuations that do not admit an O(n^(1/3)) sketch); and (2) we prove that every deterministic algorithm that accesses the function via value queries only cannot guarantee a sketching ratio better than n^(1−ε). We also show that coverage functions, an interesting subclass of submodular functions, admit arbitrarily good sketches.
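The α-sketch definition can be checked exhaustively on small ground sets (a toy illustration with hypothetical data, not the paper's construction): below, a subadditive budget-additive function is sketched by its best singleton value, which yields an n-sketch for any monotone subadditive f since max over e in S of f({e}) is at least f(S)/|S|.

```python
from itertools import combinations

def is_alpha_sketch(f, f_sketch, ground, alpha):
    """Verify f(S)/alpha <= f'(S) <= f(S) for every subset S of the ground set."""
    for r in range(len(ground) + 1):
        for S in combinations(ground, r):
            S = frozenset(S)
            if not (f(S) / alpha <= f_sketch(S) <= f(S)):
                return False
    return True

# Hypothetical budget-additive valuation (subadditive): additive up to a cap.
weights, budget = {0: 3, 1: 2, 2: 4}, 6
f = lambda S: min(sum(weights[e] for e in S), budget)
sketch = lambda S: max((f(frozenset({e})) for e in S), default=0)  # best singleton
```

With n = 3 elements, the best-singleton sketch is a 3-sketch of f but not a 1-sketch, matching the definition's two-sided requirement.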
Buying Cheap is Expensive: Hardness of Non-Parametric Multi-Product Pricing
Electronic Colloquium on Computational Complexity, Report No. 68, 2006
"... We investigate nonparametric unitdemand pricing problems, in which the goal is to find revenue maximizing prices for products P based on a set of consumer profiles C obtained, e.g., from an eCommerce website. A consumer profile consists of a number of nonzero budgets and a ranking of all the pro ..."
Abstract

Cited by 21 (6 self)
We investigate non-parametric unit-demand pricing problems, in which the goal is to find revenue-maximizing prices for a set of products P based on a set of consumer profiles C obtained, e.g., from an eCommerce website. A consumer profile consists of a number of nonzero budgets and a ranking of all the products the consumer is interested in. Once prices are fixed, each consumer chooses to buy one of the products she can afford, based on some predefined selection rule. We distinguish between the min-buying, max-buying, and rank-buying models. For the min-buying and general rank-buying models the best known approximation ratio is O(log |C|) and, previously, the problem was only known to be APX-hard. We obtain the first (near-)tight lower bound, showing that the problem is not approximable within O(log^ε |C|) for some ε > 0, unless NP ⊆ DTIME(n^(log log n)). Going to slightly stronger (still reasonable) complexity-theoretic assumptions, we prove inapproximability within O(ℓ^ε) (ℓ being an upper bound on the number of nonzero budgets per consumer) and O(|P|^ε), and provide matching upper bounds. Surprisingly, these hardness results hold even if a price-ladder constraint, i.e., a predefined total order on the prices of all products, is given. This changes if we require that in the rank-buying model consumers’ budgets are consistent with their ...
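The max-buying selection rule from this abstract can be made concrete with a small brute force (hypothetical profiles, not the paper's reductions): each consumer buys the most expensive product she can afford, and it suffices to consider candidate prices among the budget values that actually occur in the input.

```python
from itertools import product

# Hypothetical consumer profiles: consumer -> {product index: budget}
consumers = [{0: 5, 1: 3}, {1: 4}, {0: 2, 2: 6}]
n_products = 3

def revenue(prices):
    """Max-buying: each consumer buys her most expensive affordable product."""
    total = 0
    for budgets in consumers:
        affordable = [prices[p] for p, b in budgets.items() if prices[p] <= b]
        if affordable:
            total += max(affordable)
    return total

# Restrict candidate prices to budgets occurring in the profiles and enumerate.
candidates = sorted({b for c in consumers for b in c.values()})
best = max(revenue(dict(enumerate(ps)))
           for ps in product(candidates, repeat=n_products))
```

On this instance the optimum extracts every consumer's full top budget (5 + 4 + 6 = 15); the paper's point is that for min-buying and rank-buying, such exact optimization becomes hard to even approximate.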
Bayesian Incentive Compatibility via Fractional Assignments
"... Very recently, Hartline and Lucier [14] studied singleparameter mechanism design problems in the Bayesian setting. They proposed a blackbox reduction that converted Bayesian approximation algorithms into BayesianIncentiveCompatible (BIC) mechanisms while preserving social welfare. It remains a ma ..."
Abstract

Cited by 20 (3 self)
Very recently, Hartline and Lucier [14] studied single-parameter mechanism design problems in the Bayesian setting. They proposed a black-box reduction that converted Bayesian approximation algorithms into Bayesian Incentive-Compatible (BIC) mechanisms while preserving social welfare. It remains a major open question whether one can find a similar reduction in the more important multi-parameter setting. In this paper, we give a positive answer to this question when the prior distribution has finite and small support. We propose a black-box reduction for designing BIC multi-parameter mechanisms. The reduction converts any algorithm into an ε-BIC mechanism with only marginal loss in social welfare. As a result, for combinatorial auctions with subadditive agents we get an ε-BIC mechanism that achieves a constant approximation.