Results 11–20 of 22
Buying private data at auction: the sensitive surveyor’s problem
 SIGecom Exch
, 2012
Abstract
Cited by 5 (3 self)
In this letter, we survey some recent work on what we call the sensitive surveyor's problem. A curious data analyst wishes to survey a population to obtain an accurate estimate of a simple population statistic: for example, the fraction of the population testing positive for syphilis. However, because this is a statistic over sensitive data, individuals experience a cost for participating in the survey as a function of their loss in privacy. Agents must be compensated for this cost; moreover, they are strategic and will misreport their cost if doing so benefits them. The goal of the surveyor is to manage the inevitable tradeoff between the cost of the survey and the accuracy of its results.
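The abstract does not spell out the estimator. As a minimal sketch, and ignoring the payment and mechanism-design side entirely, a surveyor who has already recruited participants could release the fraction via the standard Laplace mechanism:

```python
import math
import random

def private_fraction(bits, epsilon):
    """Estimate the fraction of 1s in `bits` under epsilon-differential
    privacy. The count has sensitivity 1 (one person changes it by at
    most 1), so Laplace noise of scale 1/epsilon suffices."""
    count = sum(bits)
    # Sample Laplace(0, 1/epsilon) by inverse-CDF from a uniform draw.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)
    return (count + noise) / len(bits)
```

With a weak privacy requirement (large epsilon) the estimate is nearly exact; as epsilon shrinks, the noise, and hence the accuracy cost the surveyor must trade against payments, grows as 1/(epsilon * n).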
Distributed Private Heavy Hitters
, 2012
Abstract
Cited by 4 (0 self)
In this paper, we give efficient algorithms and lower bounds for solving the heavy hitters problem while preserving differential privacy in the fully distributed local model. In this model, there are n parties, each of which possesses a single element from a universe of size N. The heavy hitters problem is to find the identity of the most common element shared amongst the n parties. In the local model, there is no trusted database administrator, so the algorithm must interact with each of the n parties separately, using a differentially private protocol. We give tight information-theoretic upper and lower bounds on the accuracy to which this problem can be solved in the local model (giving a separation between the local model and the more common centralized model of privacy), as well as computationally efficient algorithms even in the case where the data universe N may be exponentially large.
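The paper's exact protocol is not reproduced here; a common local-model baseline for this task is k-ary randomized response, which the following hypothetical sketch illustrates. `krr_report` runs on each party's own device, so only perturbed reports ever reach the untrusted aggregator:

```python
import math
import random
from collections import Counter

def krr_report(value, universe_size, epsilon):
    """k-ary randomized response: keep the true element with probability
    e^eps / (e^eps + N - 1), otherwise report one of the other N - 1
    elements uniformly at random. Each party perturbs locally."""
    p_true = math.exp(epsilon) / (math.exp(epsilon) + universe_size - 1)
    if random.random() < p_true:
        return value
    other = random.randrange(universe_size - 1)
    return other if other < value else other + 1  # skip the true value

def heavy_hitter(reports, universe_size, epsilon):
    """Debias the noisy counts and return the most frequent element."""
    n = len(reports)
    e = math.exp(epsilon)
    p = e / (e + universe_size - 1)          # P(report v | truth is v)
    q = 1.0 / (e + universe_size - 1)        # P(report v | truth is not v)
    counts = Counter(reports)
    est = {v: (counts.get(v, 0) - n * q) / (p - q) for v in range(universe_size)}
    return max(est, key=est.get)
```

This naive scheme needs work linear in N on the aggregator, which is exactly why the paper's efficient algorithms for exponentially large universes are nontrivial.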
Differentially private projected histograms: Construction and use for prediction
 Lecture Notes in Computer Science
Abstract
Cited by 4 (1 self)
Abstract. Privacy concerns are among the major barriers to efficient secondary use of information and data on humans. Differential privacy is a relatively recent measure that has received much attention in machine learning, as it quantifies individual risk using a strong, cryptographically motivated notion of privacy. At the core of differential privacy lies the concept of information dissemination through a randomized process. One way of adding the needed randomness to any process is to pre-randomize the input. This can yield lower-quality results than other, more specialized approaches, but can be an attractive alternative when (i) no specialized differentially private alternative exists, or (ii) multiple processes applied in parallel can use the same pre-randomized input. A simple way to do input randomization is to compute perturbed histograms, which essentially are noisy multiset membership functions. Unfortunately, computation of perturbed histograms is only efficient when the data stem from a low-dimensional discrete space. The restriction to discrete spaces can be mitigated by discretization; Lei presented in 2011 an analysis of discretization in the context of M-estimators. Here we address the restriction regarding the dimensionality of the data. In particular, we present a differentially private approximation algorithm for selecting features that preserve conditional frequency densities, and use this to project data prior to computing differentially private histograms. The resulting projected histograms can be used as machine learning input and include the necessary randomness for differential privacy. We empirically validate the use of differentially private projected histograms for learning binary and multinomial logistic regression models using four real-world data sets.
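As a minimal illustration of the input-randomization idea (not the paper's projection algorithm), a perturbed histogram over a small discrete domain adds Laplace noise to each bin count; under add/remove adjacency the histogram has L1-sensitivity 1, since one record touches exactly one bin:

```python
import math
import random
from collections import Counter

def perturbed_histogram(records, bins, epsilon):
    """Noisy multiset membership function: true count per bin plus
    Laplace(1/epsilon) noise. Any downstream learner may reuse the
    released counts without further privacy accounting."""
    counts = Counter(records)

    def lap(b):  # Laplace(0, b) via inverse-CDF sampling
        u = random.random() - 0.5
        return -b * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

    return {b: counts.get(b, 0) + lap(1.0 / epsilon) for b in bins}
```

The noise is constant per bin, so accuracy collapses once the domain (the number of bins) grows exponentially with dimension; this is the efficiency restriction the paper's feature-selection projection is designed to mitigate.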
Differentially private spectrum auction with approximate revenue maximization
 In MobiHoc’14
, 2014
Abstract
Cited by 3 (3 self)
Dynamic spectrum redistribution, under which spectrum owners lease out underutilized spectrum to users for financial gain, is an effective way to improve spectrum utilization. Auctions are a natural way to incentivize spectrum owners to share their idle resources. In recent years, a number of strategyproof auction mechanisms have been proposed to stimulate bidders to truthfully reveal their valuations. However, it has been shown that truthfulness is not a necessary condition for revenue maximization. Furthermore, in most existing spectrum auction mechanisms, bidders may infer the valuations of the other bidders, which are private information, from the auction outcome. In this paper, we propose a Differentially privatE spectrum auction mechanism with Approximate Revenue maximization (DEAR). We theoretically prove that DEAR achieves approximate truthfulness, privacy preservation, and approximate revenue maximization. Our extensive evaluations show that DEAR achieves good performance in terms of both revenue and privacy preservation.
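DEAR's construction is not given in the abstract. Mechanisms in this line of work commonly build on the exponential mechanism of McSherry and Talwar, which selects high-revenue outcomes with exponentially higher probability while limiting how much any single bid shifts the outcome distribution. A hypothetical sketch, choosing a posted price with revenue as the score:

```python
import math
import random

def exponential_mechanism(candidates, score, epsilon, sensitivity):
    """Sample a candidate with probability proportional to
    exp(epsilon * score / (2 * sensitivity)). Changing one bid alters
    each probability by at most a factor of e^epsilon."""
    weights = [math.exp(epsilon * score(c) / (2.0 * sensitivity))
               for c in candidates]
    r = random.random() * sum(weights)
    for c, w in zip(candidates, weights):
        r -= w
        if r <= 0:
            return c
    return candidates[-1]

# Hypothetical example: revenue of posting price p to these bids.
bids = [3, 5, 7, 9]
revenue = lambda p: p * sum(1 for b in bids if b >= p)
# One bid can change revenue by at most the largest price, hence sensitivity=9.
price = exponential_mechanism([1, 3, 5, 7, 9], revenue,
                              epsilon=2.0, sensitivity=9.0)
```

The approximate-truthfulness and approximate-revenue claims in the abstract are exactly the kind of guarantees this primitive yields: near-optimal score with high probability, at the cost of occasionally selecting a suboptimal outcome.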
Differentially private and strategyproof spectrum auction with approximate revenue maximization
 in https://www.dropbox.com/s/z50284bvbgp2vrf/pass.pdf
, 2014
Abstract
Cited by 2 (0 self)
Abstract—The rapid growth of wireless mobile users and applications has led to high demand for spectrum. Auctions are a powerful tool to improve the utilization of spectrum resources, and many auction mechanisms have been proposed thus far. However, none of them has considered both the privacy of bidders and the revenue gain of the auctioneer together. In this paper, we study the design of privacy-preserving auction mechanisms: we first propose a differentially private auction mechanism which can achieve strategyproofness and a near-optimal expected revenue based on the concept of virtual valuation. Assuming knowledge of the bidders' valuation distributions, this near-optimal differentially private and strategyproof auction mechanism uses the generalized Vickrey-Clarke-Groves auction payment scheme to achieve high revenue with high probability. To tackle its high computational complexity, we also propose an approximate differentially PrivAte, Strategyproof, and polynomially tractable Spectrum (PASS) auction mechanism that can achieve a suboptimal revenue. PASS uses a monotone allocation algorithm and the critical payment scheme to achieve strategyproofness. We also evaluate PASS extensively via simulation, showing that it can generate more revenue than existing mechanisms in the spectrum auction market.
Differentially private convex optimization with piecewise affine objectives
 In IEEE Conf. on Decision and Control
, 2014
Abstract
Cited by 1 (0 self)
Differential privacy is a recently proposed notion of privacy that provides strong privacy guarantees without any assumptions on the adversary. This paper studies the problem of computing a differentially private solution to convex optimization problems whose objective function is piecewise affine. Such problems are motivated by applications in which the affine functions that define the objective contain sensitive user information. We propose several privacy-preserving mechanisms and analyze the tradeoffs between optimality and the level of privacy for these mechanisms. Numerical experiments are also presented to evaluate their performance in practice.
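The paper's specific mechanisms are not detailed in the abstract; one generic approach to this problem shape is output perturbation: solve exactly, then noise the released minimizer. A one-dimensional sketch under the assumption that the caller supplies a bound on how much one user's affine piece can move the minimizer:

```python
import math
import random

def solve_piecewise_affine(lines, lo, hi):
    """Exactly minimize f(x) = max_i (a_i * x + b_i) over [lo, hi].
    The objective is convex piecewise affine, so the minimizer lies at
    an endpoint or at a crossing of two lines; check those candidates."""
    cands = [lo, hi]
    for i, (a1, b1) in enumerate(lines):
        for a2, b2 in lines[i + 1:]:
            if a1 != a2:
                x = (b2 - b1) / (a1 - a2)
                if lo <= x <= hi:
                    cands.append(x)
    f = lambda x: max(a * x + b for a, b in lines)
    return min(cands, key=f)

def private_minimizer(lines, lo, hi, epsilon, sensitivity):
    """Output perturbation: release the exact minimizer plus Laplace
    noise scaled to the (assumed) sensitivity, clipped to the domain."""
    x_star = solve_piecewise_affine(lines, lo, hi)
    u = random.random() - 0.5
    noise = -(sensitivity / epsilon) * math.copysign(
        math.log(1.0 - 2.0 * abs(u)), u)
    return min(hi, max(lo, x_star + noise))
```

Bounding that sensitivity is typically the hard analytical step, which is where the tradeoff analysis between optimality and privacy level comes in.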
A SupermodularityBased Differential Privacy Preserving Algorithm for Data Anonymization
, 2014
Abstract
Maximizing data usage and minimizing privacy risk are two conflicting goals. Organizations typically apply a set of transformations to their data before releasing it. While determining the best set of transformations has been the focus of extensive work in the database community, most of this work suffers from one or both of two major problems: poor scalability and weak privacy guarantees. Differential privacy provides a theoretical formulation for privacy which ensures that the system behaves essentially the same way regardless of whether any individual is included in the database. In this paper, we address both the scalability and the privacy risk of data anonymization. We propose a scalable algorithm that meets differential privacy when combined with a specific random sampling. The contribution of the paper is twofold: 1) we propose a personalized anonymization technique based on an aggregate formulation and prove that it can be implemented in polynomial time; and 2) we show that combining the proposed aggregate formulation with specific sampling gives an anonymization algorithm that satisfies differential privacy. Our results rely heavily on exploring the supermodularity properties of the risk function, which allow us to employ techniques from convex optimization.
Take it or Leave it: Running a Survey when Privacy Comes at a Cost
, 2012
Abstract
In this paper, we consider the problem of estimating a potentially sensitive (individually stigmatizing) statistic on a population. In our model, individuals are concerned about their privacy and experience some cost as a function of their privacy loss. Nevertheless, they would be willing to participate in the survey if they were compensated for their privacy cost. These cost functions are not publicly known, however, nor do we make Bayesian assumptions about their form or distribution. Individuals are rational and will misreport their costs for privacy if doing so is in their best interest. Ghosh and Roth recently showed that in this setting, when costs for privacy loss may be correlated with private types, if individuals value differential privacy, no individually rational direct revelation mechanism can compute any nontrivial estimate of the population statistic. In this paper, we circumvent this impossibility result by proposing a modified notion of how individuals experience cost as a function of their privacy loss, and by giving a mechanism which does not operate by direct revelation. Instead, our mechanism has the ability to randomly approach individuals from a population and offer them a take-it-or-leave-it offer. This is intended to model the abilities of a surveyor who may stand on a street corner and approach passersby.
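The mechanism itself is not specified in the abstract. The following toy sketch captures only its take-it-or-leave-it structure (a fixed offer made to randomly approached individuals, with no cost reports elicited), omitting the privacy accounting entirely; the `(bit, cost)` population model is hypothetical:

```python
import random

def run_survey(population, offer, target_n, rng=random):
    """Approach individuals in random order with a fixed take-it-or-
    leave-it payment `offer`. Each person joins iff the offer covers
    her private cost, so nobody ever reveals that cost; this is what
    makes the mechanism non-direct-revelation."""
    participants = []
    spent = 0.0
    for bit, cost in rng.sample(population, len(population)):
        if len(participants) == target_n:
            break
        if offer >= cost:
            participants.append(bit)
            spent += offer
    if len(participants) < target_n:
        return None, spent  # offer too low for this population
    return sum(participants) / target_n, spent
```

Because acceptance depends on the private cost, who participates is itself correlated with sensitive information; handling that leakage is the substantive technical content of the paper, which this sketch deliberately leaves out.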