Results 1 - 6 of 6
A learning theory approach to noninteractive database privacy
In Proceedings of the 40th Annual ACM Symposium on Theory of Computing, 2008
Abstract

Cited by 220 (25 self)
In this paper we demonstrate that, ignoring computational constraints, it is possible to release synthetic databases that are useful for accurately answering large classes of queries while preserving differential privacy. Specifically, we give a mechanism that privately releases synthetic data useful for answering a class of queries over a discrete domain, with error that grows as a function of the size of the smallest net approximately representing the answers to that class of queries. We show that this in particular implies a mechanism for counting queries that gives error guarantees that grow only with the VC-dimension of the class of queries, which itself grows at most logarithmically with the size of the query class. We also show that it is not possible to release even simple classes of queries (such as intervals and their generalizations) over continuous domains with worst-case utility guarantees while preserving differential privacy. In response to this, we consider a relaxation of the utility guarantee and give a privacy-preserving polynomial-time algorithm that for any halfspace query will provide an answer that is accurate for some small perturbation of the query. This algorithm does not release synthetic data, but instead another data structure capable of representing an answer for each query. We also give an efficient algorithm for releasing synthetic data for the class of interval queries and axis-aligned rectangles of constant dimension over discrete domains.
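The counting-query setting this abstract describes can be illustrated with the standard Laplace mechanism, the basic building block for differentially private counts. This is a minimal sketch of that textbook mechanism, not the net-based synthetic-data mechanism the paper constructs; the data and function names are illustrative.

```python
import random

def private_count(data, predicate, epsilon):
    """Laplace mechanism for a counting query: the count has sensitivity 1,
    so adding Laplace noise of scale 1/epsilon gives epsilon-differential
    privacy. A Laplace(0, 1/epsilon) sample is a random sign times an
    exponential with rate epsilon."""
    true_count = sum(1 for row in data if predicate(row))
    noise = random.choice((-1, 1)) * random.expovariate(epsilon)
    return true_count + noise

# Illustrative query: how many records have age >= 40? (true answer: 3)
ages = [23, 35, 47, 51, 62, 29]
print(private_count(ages, lambda a: a >= 40, epsilon=0.5))  # noisy answer near 3
```

Smaller epsilon means more noise and stronger privacy; averaging many independent runs recovers the true count, which is why repeated querying degrades privacy.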
A Size-Sensitive Discrepancy Bound for Set Systems of Bounded Primal Shatter Dimension
Abstract

Cited by 1 (0 self)
Let (X, S) be a set system on an n-point set X. The discrepancy of S is defined as the minimum, over all two-colorings χ on X, of the largest deviation from an even split over the sets S ∈ S. We consider the scenario where, for any subset X′ ⊆ X of size m ≤ n and for any parameter 1 ≤ k ≤ m, the number of restrictions of the sets of S to X′ of size at most k is only O(m^{d_1} k^{d − d_1}), for fixed integers d > 0 and 1 ≤ d_1 ≤ d (this generalizes the standard notion of bounded primal shatter dimension when d_1 = d). In this case we show that there exists a coloring χ with discrepancy bound O*(|S|^{1/2 − d_1/(2d)} n^{(d_1 − 1)/(2d)}) for each S ∈ S, where O*(·) hides a polylogarithmic factor in n. This bound is tight up to a polylogarithmic factor [25, 27], and the corresponding coloring χ can be computed in expected polynomial time using the very recent machinery of Lovett and Meka for constructive discrepancy minimization [24]. Our bound improves and generalizes the bounds obtained from the machinery of Har-Peled and Sharir [19] (and the follow-up work in [32]) for points and halfspaces in d-space for d ≥ 3. Last but not least, we show that our bound yields improved bounds for the size of relative (ε, δ)-approximations for set systems of the above kind.
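The discrepancy notion defined in this abstract can be checked directly on tiny instances by exhaustive search over all two-colorings. This is an exponential-time illustration of the definition only, not the Lovett and Meka constructive machinery the paper uses; the example set system is illustrative.

```python
from itertools import product

def discrepancy(n, sets):
    """Brute-force discrepancy of a set system over points 0..n-1: the
    minimum over all two-colorings chi: X -> {-1, +1} of the largest
    |sum_{x in S} chi(x)| over the sets S (deviation from an even split)."""
    best = float("inf")
    for chi in product((-1, 1), repeat=n):
        worst = max(abs(sum(chi[x] for x in S)) for S in sets)
        best = min(best, worst)
    return best

# Example: all intervals of 4 points, a classic low-discrepancy system.
# An alternating coloring +1, -1, +1, -1 keeps every interval sum in {-1, 0, 1}.
points = 4
intervals = [list(range(i, j)) for i in range(points) for j in range(i + 1, points + 1)]
print(discrepancy(points, intervals))  # 1
```

Any single-point interval forces discrepancy at least 1, so the alternating coloring is optimal here; for richer set systems the brute-force minimum grows, which is what the paper's bound controls.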
Conjecture 1 ([2]). For any function f: N → {−1, +1} and for any constant C,
Abstract
Abstract. We show that the hereditary discrepancy of homogeneous arithmetic progressions is lower bounded by n^{1/O(log log n)}. This bound is tight up to the constant in the exponent.
The Composition Theorem for Differential Privacy, by Sewoong Oh and Pramod Viswanath
Abstract
Interactive querying of a database degrades the privacy level. In this paper we answer the fundamental question of characterizing the level of differential privacy degradation as a function of the number of adaptive interactions and the differential privacy levels maintained by the individual queries. Our solution is complete: the privacy degradation guarantee holds for every privacy mechanism, and, further, we demonstrate a sequence of privacy mechanisms that do degrade in the characterized manner. The key innovation is the introduction of an operational interpretation of differential privacy (involving hypothesis testing) and the use of the corresponding data processing inequalities. Our result improves over the state of the art and has immediate applications to several problems studied in the literature.
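The degradation this abstract characterizes can be put in context by sketching the two standard composition bounds it improves on: basic composition, which simply sums the privacy budgets, and the advanced composition theorem of Dwork, Rothblum, and Vadhan. This is a sketch of those textbook bounds, not the paper's exact optimal characterization.

```python
import math

def basic_composition(epsilons):
    """Basic composition: k adaptive mechanisms with budgets eps_1..eps_k
    together satisfy (sum of eps_i)-differential privacy."""
    return sum(epsilons)

def advanced_composition(epsilon, k, delta_prime):
    """Advanced composition (Dwork, Rothblum, Vadhan): k adaptive eps-DP
    mechanisms satisfy (eps', k*delta + delta')-DP with
    eps' = sqrt(2k ln(1/delta')) * eps + k * eps * (e^eps - 1),
    which scales like sqrt(k) rather than k for small eps."""
    return (math.sqrt(2 * k * math.log(1 / delta_prime)) * epsilon
            + k * epsilon * (math.exp(epsilon) - 1))

# 100 adaptive queries at eps = 0.1 each:
print(basic_composition([0.1] * 100))        # 10.0: linear growth in k
print(advanced_composition(0.1, 100, 1e-6))  # about 6.3: roughly sqrt(k) growth
```

The paper's composition theorem tightens the advanced bound further and shows it is achieved by an explicit sequence of mechanisms, so the characterization is exact.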