Results 1–10 of 12
Property Testing Lower Bounds Via Communication Complexity
, 2011
Abstract

Cited by 34 (8 self)
We develop a new technique for proving lower bounds in property testing, by showing a strong connection between testing and communication complexity. We give a simple scheme for reducing communication problems to testing problems, thus allowing us to use known lower bounds in communication complexity to prove lower bounds in testing. This scheme is general and implies a number of new testing bounds, as well as simpler proofs of several known bounds. For the problem of testing whether a Boolean function is k-linear (a parity function on k variables), we achieve a lower bound of Ω(k) queries, even for adaptive algorithms with two-sided error, thus confirming a conjecture of Goldreich [25]. The same argument behind this lower bound also implies a new proof of known lower bounds for testing related classes such as k-juntas. For some classes, such as the class of monotone functions and the class of s-sparse GF(2) polynomials, we significantly strengthen the best known bounds.
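For intuition about the property being tested, the following sketch spells out the definition of a k-linear function: a parity on exactly k of the n variables. It is a brute-force check of the definition (exponential in n), not a tester, and the helper names are illustrative only.

```python
from itertools import combinations, product

def parity(subset):
    """Return the parity (XOR) function on the variables in `subset`."""
    return lambda x: sum(x[i] for i in subset) % 2

def is_k_linear(f, n, k):
    """Brute-force check of the definition: does f equal some parity
    on exactly k of its n variables? A tester avoids enumerating all
    2^n inputs; this sketch only illustrates what is being tested."""
    points = list(product([0, 1], repeat=n))
    for subset in combinations(range(n), k):
        g = parity(subset)
        if all(f(x) == g(x) for x in points):
            return True
    return False

# x0 XOR x2 is 2-linear over 4 variables, but not 1-linear
f = lambda x: (x[0] + x[2]) % 2
print(is_k_linear(f, 4, 2))  # True
print(is_k_linear(f, 4, 1))  # False
```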
Testing Juntas Nearly Optimally
Abstract

Cited by 33 (6 self)
A function on n variables is called a k-junta if it depends on at most k of its variables. In this article, we show that it is possible to test whether a function is a k-junta or is “far” from being a k-junta with O(k/ɛ + k log k) queries, where ɛ is the approximation parameter. This result improves on the previous best upper bound of Õ(k^{3/2})/ɛ queries and is asymptotically optimal, up to a logarithmic factor. We obtain the improved upper bound by introducing a new algorithm with one-sided error for testing juntas. Notably, the algorithm is a valid junta tester under very general conditions: it holds for functions with arbitrary finite domains and ranges, and it holds under any product distribution over the domain. A key component of the analysis of the new algorithm is a new structural result on juntas: roughly, we show that if a function f is “far” from being a k-junta, then f is “far” from being determined by k parts in a random partition of the variables. The structural lemma is proved using the Efron–Stein decomposition method.
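The notion of a variable being relevant can be made concrete via influence: the probability that flipping that variable changes the function's value. The sketch below (illustrative names, not the paper's tester) estimates influences by sampling and uses them to locate the relevant variables of a small junta.

```python
import random

def influence(f, n, i, samples=2000):
    """Estimate the influence of variable i on f: {0,1}^n -> {0,1},
    i.e. Pr[f(x) != f(x with bit i flipped)] over uniformly random x."""
    diff = 0
    for _ in range(samples):
        x = [random.randint(0, 1) for _ in range(n)]
        y = list(x)
        y[i] ^= 1  # flip bit i
        diff += f(x) != f(y)
    return diff / samples

# majority of the first 3 of 10 variables: a 3-junta; each relevant
# variable has influence 1/2, every other variable has influence 0
f = lambda x: int(x[0] + x[1] + x[2] >= 2)
relevant = [i for i in range(10) if influence(f, 10, i) > 0.1]
print(relevant)  # [0, 1, 2]
```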
Testing Fourier dimensionality and sparsity
Abstract

Cited by 22 (3 self)
We present a range of new results for testing properties of Boolean functions that are defined in terms of the Fourier spectrum. Broadly speaking, our results show that the property of a Boolean function having a concise Fourier representation is locally testable. We first give an efficient algorithm for testing whether the Fourier spectrum of a Boolean function is supported in a low-dimensional subspace of F_2^n (equivalently, for testing whether f is a junta over a small number of parities). We next give an efficient algorithm for testing whether a Boolean function has a sparse Fourier spectrum (a small number of nonzero coefficients). In both cases we also prove lower bounds showing that any testing algorithm — even an adaptive one — must have query complexity within a polynomial factor of that of our algorithms, which are non-adaptive. Finally, we give an “implicit learning” algorithm that lets us test any sub-property of Fourier concision. Our technical contributions include new structural results about sparse Boolean functions and a new analysis of the pairwise-independent hashing of Fourier coefficients from [13].
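For intuition about Fourier sparsity, here is a brute-force computation of the Fourier coefficients of a small Boolean function (exponential in n, purely illustrative: a tester must work with far fewer queries). A parity function is the extreme case, with a single nonzero coefficient.

```python
from itertools import product

def fourier_coefficients(f, n):
    """Brute-force Fourier expansion of f: {0,1}^n -> {0,1}, viewing f
    as the ±1-valued function F(x) = (-1)^f(x). Returns a dict mapping
    each character S (an indicator tuple) to the coefficient F_hat(S)."""
    points = list(product([0, 1], repeat=n))
    coeffs = {}
    for S in product([0, 1], repeat=n):
        total = 0.0
        for x in points:
            chi = (-1) ** sum(s * xi for s, xi in zip(S, x))
            total += (-1) ** f(x) * chi
        coeffs[S] = total / len(points)
    return coeffs

# the parity of variables 0 and 1 (over 3 variables) is 1-sparse:
# its only nonzero coefficient sits on the character S = (1, 1, 0)
f = lambda x: (x[0] + x[1]) % 2
nonzero = [S for S, c in fourier_coefficients(f, 3).items() if abs(c) > 1e-9]
print(nonzero)  # [(1, 1, 0)]
```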
Lower bounds for testing computability by small width OBDDs
 In Proc. 8th Annual Conference on Theory and Applications of Models of Computation
, 2011
Abstract

Cited by 6 (1 self)
We consider the problem of testing whether a function f: {0,1}^n → {0,1} is computable by a read-once, width-2 ordered binary decision diagram (OBDD), also known as a branching program. This problem has two variants: one where the variables must occur in a fixed, known order, and one where the variables are allowed to occur in an arbitrary order. We show that for both variants, any non-adaptive testing algorithm must make Ω(n) queries, and thus any adaptive testing algorithm must make Ω(log n) queries. We also consider the more general problem of testing computability by width-w OBDDs where the variables occur in a fixed order. We show that for any constant w ≥ 4, Ω(n) queries are required, resolving a conjecture of Goldreich [15]. We prove all of our lower bounds using a new technique of Blais, Brody, and Matulef [6], giving simple reductions from known hard problems in communication complexity to the testing problems at hand. Our result for width-2 OBDDs provides the first example of the power of this technique for proving strong non-adaptive bounds.
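For readers unfamiliar with the model: a read-once OBDD reads the variables one at a time in order, keeping only one of w states per level, and the final state determines the output. The sketch below uses a hypothetical encoding (each level maps a (state, bit) pair to the next state) chosen for illustration; it is not taken from the paper.

```python
def eval_obdd(layers, x):
    """Evaluate a read-once, width-w OBDD on input x.
    layers[i][s][b] is the state reached from state s at level i
    when bit x[i] equals b; the final state is the output."""
    state = 0
    for i, layer in enumerate(layers):
        state = layer[state][x[i]]
    return state

# a width-2 OBDD computing the parity x0 XOR x1 XOR x2:
# the state tracks the running parity, so each level XORs in one bit
xor_layer = [(0, 1), (1, 0)]  # state s, bit b -> s XOR b
layers = [xor_layer, xor_layer, xor_layer]
print(eval_obdd(layers, [1, 1, 0]))  # 0
print(eval_obdd(layers, [1, 0, 0]))  # 1
```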
On Active and Passive Testing
, 2013
Abstract

Cited by 1 (0 self)
Given a property of Boolean functions, what is the minimum number of queries required to determine with high probability whether an input function satisfies this property? This is a fundamental question in property testing, where traditionally the testing algorithm is allowed to pick its queries from the entire set of inputs. Balcan et al. have recently suggested restricting the tester to take its queries from a smaller, typically random, subset of the inputs. This model is called active testing, in analogy with active learning. Active testing gets more difficult as the size of the set we can query from decreases, and the extreme case is when it is exactly the number of queries we perform (so the algorithm actually has no choice). This is known as passive testing, or testing from random examples. In their paper, Balcan et al. showed that active and passive testing of dictator functions is as hard as learning them, and requires Θ(log n) queries (unlike the classic model, in which it can be done with a constant number of queries). We extend this result to k-linear functions, proving that passive and active testing of them requires Θ(k log n) queries, assuming k is not too large.
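To see why O(log n) random examples suffice for dictators on the upper-bound side, consider the standard elimination argument: each labeled random example disagrees with about half of the remaining candidate coordinates. The sketch below (illustrative names, not the paper's algorithm) learns a dictator function x ↦ x_i from passive, labeled random examples.

```python
import random

def learn_dictator(samples, n):
    """Keep every coordinate i consistent with all labeled examples
    (x, y). For a dictator target x -> x[i], each random example cuts
    the surviving candidates roughly in half, so O(log n) examples
    typically isolate the single relevant coordinate."""
    candidates = set(range(n))
    for x, y in samples:
        candidates = {i for i in candidates if x[i] == y}
    return candidates

n, target = 64, 17
samples = []
for _ in range(30):  # 30 >> log2(64) examples, so elimination finishes w.h.p.
    x = [random.randint(0, 1) for _ in range(n)]
    samples.append((x, x[target]))  # labels come from the dictator x -> x[17]
print(target in learn_dictator(samples, n))  # True
```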
On Approximating the Number of Relevant Variables in a Function
, 2011
Abstract

Cited by 1 (0 self)
In this work we consider the problem of approximating the number of relevant variables in a function, given query access to the function. Since obtaining a multiplicative-factor approximation is hard in general, we consider several relaxations of the problem. In particular, we consider a relaxation of the property-testing variant of the problem, and we consider relaxations in which we have a promise that the function belongs to a certain family of functions (e.g., linear functions). In the former relaxation, the task is to distinguish between the case that the number of relevant variables is at most k, and the case in which the function is far from any function in which the number of relevant variables is at most (1 + γ)k, for a parameter γ. We give both upper bounds and almost matching lower bounds for the relaxations we study.

In many scientific endeavors, an important challenge is making sense of huge datasets. In particular, when trying to make sense of functional relationships we would like to know or estimate the number of variables that a function depends upon. This can be useful both as a preliminary process for machine learning and statistical inference and, independently, as a measure of the complexity of the
Local Correction with Constant Error Rate
, 2012
Abstract
A Boolean function f of n variables is said to be q-locally correctable if, given black-box access to a function g which is “close” to an isomorphism f_σ(x) = f_σ(x_1, ..., x_n) = f(x_{σ(1)}, ..., x_{σ(n)}) of f, we can compute f_σ(x) for any x ∈ Z_2^n with good probability using q queries to g. It is known that degree-d polynomials are O(2^d)-locally correctable, and that most k-juntas are O(k log k)-locally correctable, where the closeness parameter, or more precisely the distance between g and f_σ, is required to be exponentially small (in d and k, respectively). In this work we relax the requirement on the closeness parameter by allowing the distance between the functions to be a constant. We first investigate the family of juntas, and show that almost every k-junta is O(k log² k)-locally correctable for any distance ε < 0.001. A similar result is shown for the family of partially symmetric functions, that is, functions which are indifferent to any reordering of all but a constant number of their variables. For both families, the algorithms provided here use non-adaptive queries and are applicable to most but not all functions of each family (as it is shown to be impossible to locally correct all of them). Our approach utilizes the measure of symmetric influence introduced in the recent analysis of testing partial symmetry of functions.
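The flavor of local correction can be seen in the classical self-corrector for linear (parity) functions, which underlies corrector results of this type. The sketch below is illustrative only (the paper handles the harder setting of isomorphism classes and constant distance for juntas): it recovers f(x) from a mildly corrupted oracle g by majority vote over random shifts, using f(x) = f(x ⊕ r) ⊕ f(r).

```python
import random

def self_correct_linear(g, x, n, trials=49):
    """Classical self-corrector for a linear (parity) function: if g
    agrees with some parity f on most inputs, recover f(x) via a
    majority vote over g(x XOR r) XOR g(r) for uniformly random r.
    Each vote is wrong only if g errs at x XOR r or at r."""
    votes = 0
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        xr = [a ^ b for a, b in zip(x, r)]
        votes += g(xr) ^ g(r)
    return int(votes > trials / 2)

# g is the parity x0 XOR x1 corrupted on a single point of {0,1}^4
f = lambda x: x[0] ^ x[1]
bad = (1, 0, 0, 0)
g = lambda x: f(x) ^ (tuple(x) == bad)
# querying g directly at the corrupted point gives the wrong answer,
# but the self-corrector still recovers f there with high probability
print(self_correct_linear(g, [1, 0, 0, 0], 4))  # 1
```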
Testing Juntas
, 2010
Abstract
A function on n variables is called a k-junta if it depends on at most k of its variables. The problem of testing whether a function is a k-junta or is “far” from being a k-junta is a central problem in property testing and is closely related to the problem of learning high-dimensional data. In this note, we give an informal presentation of three recent algorithms for testing juntas efficiently.