Results 1–10 of 10
Selling privacy at auction. In: Proceedings of the 12th ACM Conference on Electronic Commerce, 2011
Abstract

Cited by 51 (12 self)
We initiate the study of markets for private data, through the lens of differential privacy. Although the purchase and sale of private data has already begun on a large scale, a theory of privacy as a commodity is missing. In this paper, we propose to build such a theory. Specifically, we consider a setting in which a data analyst wishes to buy information from a population from which he can estimate some statistic. The analyst wishes to obtain an accurate estimate cheaply, while the owners of the private data experience some cost for their loss of privacy, and must be compensated for this loss. Agents are selfish, and wish to maximize their profit, so our goal is to design truthful mechanisms. Our main result is that such problems can naturally be viewed and optimally solved as variants of multi-unit procurement auctions. Based on this result, we derive auctions which are optimal up to small constant factors for two natural settings:

1. When the data analyst has a fixed accuracy goal, we show that an application of the classic Vickrey auction achieves the analyst's accuracy goal while minimizing his total payment.
2. When the data analyst has a fixed budget, we give a mechanism which maximizes the accuracy of the resulting estimate while guaranteeing that the resulting sum of payments does not exceed the analyst's budget.

In both cases, our comparison class is the set of envy-free mechanisms, which correspond to the natural class of fixed-price mechanisms in our setting. In both of these results, we ignore the privacy cost due to possible correlations between an individual's private data and his valuation for privacy itself. We then show that generically, no individually rational mechanism can compensate individuals for the privacy loss incurred due to their reported valuations for privacy. This is nevertheless an important issue, and modeling it correctly is one of the many exciting directions for future work.
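The Vickrey-style procurement rule the abstract refers to can be illustrated with a minimal sketch (hypothetical helper name; not the paper's implementation): buy data from the k lowest-cost sellers and pay each of them the (k+1)-th lowest bid, which makes truthful bidding a dominant strategy.

```python
def vickrey_procurement(bids, k):
    """Select the k lowest-cost sellers and pay each the (k+1)-th
    lowest bid.  A winner's payment does not depend on his own bid,
    so no seller can gain by misreporting his privacy cost."""
    if k >= len(bids):
        raise ValueError("need at least k + 1 bids to set the price")
    order = sorted(range(len(bids)), key=lambda i: bids[i])
    winners = order[:k]        # indices of the k cheapest sellers
    price = bids[order[k]]     # uniform payment: the (k+1)-th lowest bid
    return winners, price

# Hypothetical example: the analyst needs k = 3 reports.
winners, price = vickrey_procurement([5.0, 1.0, 3.0, 2.0, 8.0], 3)
# winners are the sellers who bid 1.0, 2.0, and 3.0; each is paid 5.0
```

Note that every winner is paid at least his bid, so the mechanism is individually rational for the reported costs.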
The differential privacy frontier (extended abstract). In: TCC, 2009
Abstract

Cited by 30 (0 self)
We review the definition of differential privacy and briefly survey a handful of very recent contributions to the differential privacy frontier. Differential privacy is a strong privacy guarantee for an individual’s input to a (randomized) function or sequence of functions, which we call a privacy mechanism. Informally, the guarantee says that the behavior of the mechanism is essentially unchanged independent of whether any individual opts into or opts out of the data set. Designed for statistical analysis, for example, of health or census data, the definition protects the privacy of individuals, and small groups of individuals, while permitting very different outcomes in the case of very different data sets. We begin by recalling some differential privacy basics. While the frontier of a vibrant area is always in flux, we will endeavor to give an impression of the state of the art by surveying a handful of extremely recent advances.
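As a concrete illustration of the definition (a minimal sketch, not code from the survey itself), the classic Laplace mechanism releases a counting query with ε-differential privacy by adding noise calibrated to the query's sensitivity:

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) by inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with epsilon-differential privacy.  One person
    opting into or out of the data set changes the count by at most
    `sensitivity`, so Laplace(sensitivity / epsilon) noise shifts the
    output distribution by at most a multiplicative e^epsilon factor."""
    return true_count + laplace_noise(sensitivity / epsilon)
```

The noise is unbiased, so averaging many releases concentrates around the true count; the per-release error scales as 1/ε.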
Optimal Lower Bounds for Universal and Differentially Private Steiner Trees and TSPs
Abstract

Cited by 3 (1 self)
Given a metric space on n points, an α-approximate universal algorithm for the Steiner tree problem outputs a distribution over rooted spanning trees such that for any subset X of vertices containing the root, the expected cost of the induced subtree is within an α factor of the optimal Steiner tree cost for X. An α-approximate differentially private algorithm for the Steiner tree problem takes as input a subset X of vertices, and outputs a tree distribution that induces a solution within an α factor of the optimal as before, and satisfies the additional property that for any set X′ that differs in a single vertex from X, the tree distributions for X and X′ are “close” to each other. Universal and differentially private algorithms for TSP are defined similarly. An α-approximate universal algorithm for the Steiner tree problem or TSP is also an α-approximate differentially private algorithm. It is known that both problems admit O(log n)-approximate universal algorithms, and hence O(log n)-approximate differentially private algorithms as well. We prove an Ω(log n) lower bound on the approximation ratio achievable for the universal Steiner tree problem and the universal TSP, matching the known upper bounds. Our lower bound for the Steiner tree problem holds even when the algorithm is allowed to output a more general …
New Algorithms for Preserving Differential Privacy
, 2010
Cited by 2 (1 self)
Privacy-Integrated Graph Clustering Through Differential Privacy
Abstract
Data mining tasks like graph clustering can automatically process a large amount of data and retrieve valuable information. However, publishing such graph clustering results also involves privacy risks. In particular, linking the result with available background knowledge can disclose private information of the data set. The strong privacy guarantees of the differential privacy model allow coping with the arbitrarily large background knowledge of a potential adversary. As current definitions of neighboring graphs do not fulfill the needs of graph clustering results, this paper proposes a new one. Furthermore, this paper proposes a graph clustering approach that guarantees 1-edge-differential privacy for its results. Besides giving strong privacy guarantees, our approach is able to calculate usable results. Those guarantees are ensured by perturbing the input graph. We have thoroughly evaluated our approach on synthetic data as well as on real-world graphs.
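The input-perturbation idea can be sketched generically with randomized response on edges (an illustrative sketch, not the authors' exact mechanism): flipping each potential edge independently with probability 1/(1 + e^ε) makes graphs that differ in a single edge nearly indistinguishable, and any clustering computed on the perturbed graph then inherits 1-edge-differential privacy by post-processing.

```python
import math
import random

def perturb_edges(n, edges, epsilon):
    """Randomized response on each of the n*(n-1)/2 potential edges:
    flip an edge's presence with probability 1 / (1 + e^epsilon).
    Output distributions for two graphs differing in one edge are
    within a factor e^epsilon, i.e. 1-edge-differential privacy."""
    p_flip = 1.0 / (1.0 + math.exp(epsilon))
    edge_set = {tuple(sorted(e)) for e in edges}
    out = set()
    for i in range(n):
        for j in range(i + 1, n):
            present = (i, j) in edge_set
            if random.random() < p_flip:
                present = not present   # flip with small probability
            if present:
                out.add((i, j))
    return out
```

For small ε the flip probability approaches 1/2 (strong privacy, noisy output); for large ε the graph passes through almost unchanged, which is the usual privacy–utility trade-off.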
CS787: Advanced Algorithms Topic: Differentially Private Approximation Algorithms Presenter(s): Balasubramanian Sivan,
Abstract
Instead of stating a new problem and presenting related results, we revisit existing problems and consider their privacy implications. Consider the following problems:

• Assign people using a social network such as Facebook to one of two servers so that most pairs of friends are assigned to the same server.
• Open a certain number of HIV treatment centers so that the average commute time for
Thesis Proposal: New Algorithms for Preserving Differential Privacy
, 2009
Abstract
In this thesis, we will consider the problem of how one should perform computations on private data. We will specifically consider algorithms which preserve the recent formalization of privacy known as differential privacy. The fundamental tradeoff that we consider is that of privacy and utility. For which tasks can we perform useful computations while still preserving privacy, and what exactly is the tradeoff between usefulness and privacy? Finally, we will also consider the intriguing connections between the fields of differential privacy and game theory.
Research Statement
, 2015
Abstract
My research focuses on problems in differential privacy, mechanism design, learning theory and increasingly, on the surprisingly rich intersection of these fields. These areas are bound together not only by a common set of techniques, but also by the underlying problems that they wish to solve. For example, privacy and mechanism design are both concerned about the now common task of computing on data that may not