Results 1–10 of 19
Some Applications of Coding Theory in Computational Complexity
, 2004
"... Errorcorrecting codes and related combinatorial constructs play an important role in several recent (and old) results in computational complexity theory. In this paper we survey results on locallytestable and locallydecodable errorcorrecting codes, and their applications to complexity theory ..."
Abstract

Cited by 69 (2 self)
Error-correcting codes and related combinatorial constructs play an important role in several recent (and old) results in computational complexity theory. In this paper we survey results on locally testable and locally decodable error-correcting codes, and their applications to complexity theory and to cryptography.
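As a self-contained illustration (a toy sketch, not taken from the survey itself), the Hadamard code is the textbook example of a locally decodable code: any single message bit can be recovered by querying just two codeword positions.

```python
import random

def hadamard_encode(msg_bits):
    """Encode k message bits as the 2^k evaluations of x -> <msg, x> mod 2."""
    k = len(msg_bits)
    return [sum(m & ((x >> i) & 1) for i, m in enumerate(msg_bits)) % 2
            for x in range(2 ** k)]

def local_decode_bit(codeword, k, i, rng=random):
    """Recover message bit i with only 2 queries into the codeword.

    Since <m, r> XOR <m, r XOR e_i> = m_i, the answer is exact on an
    uncorrupted word, and correct with probability >= 1 - 2*delta when a
    delta fraction of positions has been corrupted.
    """
    r = rng.randrange(2 ** k)
    return codeword[r] ^ codeword[r ^ (1 << i)]
```

The key point is that the decoder's two query positions are (individually) uniformly random, so corrupting a small fraction of the codeword can only slightly bias the answer.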
Correcting errors without leaking partial information
 In 37th Annual ACM Symposium on Theory of Computing (STOC)
, 2005
"... This paper explores what kinds of information two parties must communicate in order to correct errors which occur in a shared secret string W. Any bits they communicate must leak a significant amount of information about W — that is, from the adversary’s point of view, the entropy of W will drop sig ..."
Abstract

Cited by 66 (9 self)
This paper explores what kinds of information two parties must communicate in order to correct errors which occur in a shared secret string W. Any bits they communicate must leak a significant amount of information about W — that is, from the adversary’s point of view, the entropy of W will drop significantly. Nevertheless, we construct schemes with which Alice and Bob can prevent an adversary from learning any useful information about W. Specifically, if the entropy of W is sufficiently high, then there is no function f(W) which the adversary can learn from the error-correction information with significant probability. This leads to several new results: (a) the design of noise-tolerant “perfectly one-way” hash functions in the sense of Canetti et al. [7], which in turn leads to obfuscation of proximity queries for high-entropy secrets W; (b) private fuzzy extractors [11], which allow one to extract uniformly random bits from noisy and nonuniform data W, while also ensuring that no sensitive information about W is leaked; and (c) noise tolerance and stateless key reuse in the Bounded Storage Model, resolving the main open problem of Ding [10]. The heart of our constructions is the design of strong randomness extractors with the property that the source W can be recovered from the extracted randomness and any string W′ which is close to W.
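A minimal sketch of the code-offset mechanism that underlies this kind of error-correction information (here with a hypothetical 5× repetition code; this illustrates only the recovery step, not the paper's stronger guarantee that no function of W leaks):

```python
import secrets

REP = 5  # repetition factor; corrects up to 2 flipped copies per block

def rep_encode(bits):
    """Repeat each bit REP times."""
    return [b for b in bits for _ in range(REP)]

def rep_decode(bits):
    """Majority-decode each block of REP copies back to one bit."""
    return [int(sum(bits[i * REP:(i + 1) * REP]) > REP // 2)
            for i in range(len(bits) // REP)]

def sketch(w):
    """Code-offset sketch: publish s = w XOR C(x) for a random codeword C(x)."""
    x = [secrets.randbelow(2) for _ in range(len(w) // REP)]
    return [wi ^ ci for wi, ci in zip(w, rep_encode(x))]

def recover(w_noisy, s):
    """Given w' close to w and the public sketch s, recover the original w."""
    c_noisy = [wi ^ si for wi, si in zip(w_noisy, s)]  # = C(x) plus the errors
    c = rep_encode(rep_decode(c_noisy))                # decode to the codeword
    return [ci ^ si for ci, si in zip(c, s)]           # w = C(x) XOR s
```

Because w' XOR s equals the random codeword plus the noise, decoding the code removes the noise, and XORing with s again returns the exact original w.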
Entropic Security and the Encryption of High Entropy Messages
"... We study entropic security, an informationtheoretic notion of security introduced by Russell and Wang [24] in the context of encryption and by Canetti et al. [5, 6] in the context of hash functions. Informally, a probabilitic map Y = E(X) (e.g., an encryption sheme or a hash function) is entropica ..."
Abstract

Cited by 31 (7 self)
We study entropic security, an information-theoretic notion of security introduced by Russell and Wang [24] in the context of encryption and by Canetti et al. [5, 6] in the context of hash functions. Informally, a probabilistic map Y = E(X) (e.g., an encryption scheme or a hash function) is entropically secure if knowledge of Y does not help predicting any predicate of X, whenever X has high min-entropy from the adversary’s point of view. On one hand, we strengthen the formulation of [5, 6, 24] and show that entropic security in fact implies that Y does not help predicting any function of X (as opposed to a predicate), bringing this notion closer to the conventional notion of semantic security [10]. On the other hand, we also show that entropic security is equivalent to indistinguishability on pairs of input distributions of sufficiently high entropy, which is in turn related to randomness extraction from nonuniform distributions [21]. We then use the equivalence above, and the connection to randomness extraction, to prove several new results on entropically secure encryption. First, we give two general frameworks for constructing entropically secure encryption schemes: one based on expander graphs and the other on XOR-universal hash functions. These schemes generalize the schemes of Russell and Wang, yielding simpler constructions and proofs, as well as improved parameters. To encrypt an n-bit message of min-entropy t while allowing the adversary at most ɛ advantage, our best schemes use a shared secret key of length k = n − t + 2 log(1/ɛ). Second, we obtain lower ...
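For intuition, one common way to build an XOR-universal pad is multiplication in a finite field: for a random public index i, the family h_i(k) = i·k over GF(2^n) is XOR-universal. The single-byte sketch below (a hypothetical toy instantiation, not the paper's actual scheme or parameters) shows the mechanics: the ciphertext is the index together with the message XORed with h_i(key).

```python
import secrets

IRRED = 0x11B  # x^8 + x^4 + x^3 + x + 1, the GF(2^8) polynomial used by AES

def gf_mul(a, b):
    """Carry-less multiplication in GF(2^8), reduced modulo IRRED."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= IRRED
    return r

def encrypt(key, msg):
    """E(x) = (i, x XOR h_i(key)) for a random nonzero public index i."""
    i = secrets.randbelow(255) + 1
    return i, msg ^ gf_mul(i, key)

def decrypt(key, ct):
    i, c = ct
    return c ^ gf_mul(i, key)
```

The index i is sent in the clear; secrecy rests on the key and on the message having enough min-entropy, which is exactly the regime entropic security addresses.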
Hardness of Approximation of the Balanced Complete Bipartite Subgraph Problem
, 2004
"... We prove that the Maximum Balanced Complete Bipartite Subgraph (BCBS) problem is (log n)δ hard to approximate within � a factor of 2 for some δ>0 under the plausible assumption that 3SAT � ∈ DTIME 2n3/4+ɛ � for some ɛ>0. WealsoshowthatitisNPhard to approximate the BCBS problem within a const ..."
Abstract

Cited by 24 (0 self)
We prove that the Maximum Balanced Complete Bipartite Subgraph (BCBS) problem is hard to approximate within a factor of 2^((log n)^δ) for some δ > 0 under the plausible assumption that 3-SAT ∉ DTIME(2^(n^(3/4+ɛ))) for some ɛ > 0. We also show that it is NP-hard to approximate the BCBS problem within a constant factor under the assumption that it is NP-hard to approximate the maximum clique problem within a factor of n/2^(c√(lg n)) for some small enough c > 0. Furthermore, we show that the same hardness of approximation results hold for the Maximum Edge Biclique problem.
Small Pseudo-Random Families of Matrices: Derandomizing Approximate Quantum Encryption
 In Proceedings of RANDOM 2004
, 2004
"... A quantum encryption scheme (also called private quantum channel, or state randomization protocol) is a onetime pad for quantum messages. If two parties share a classical random string, one of them can transmit a quantum state to the other so that an eavesdropper gets little or no information ab ..."
Abstract

Cited by 24 (0 self)
A quantum encryption scheme (also called a private quantum channel, or state randomization protocol) is a one-time pad for quantum messages. If two parties share a classical random string, one of them can transmit a quantum state to the other so that an eavesdropper gets little or no information about the state being transmitted. Perfect encryption schemes leak no information at all about the message. Approximate encryption schemes leak a nonzero (though small) amount of information but require a shorter shared random key. Approximate schemes with short keys have been shown to have a number of applications in quantum cryptography and information theory [8]. This paper ...
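The perfect scheme referred to here is the Pauli one-time pad, which spends two classical key bits per encrypted qubit. A toy single-qubit sketch in plain Python (an illustration of the perfect scheme only, not the paper's derandomized approximate construction):

```python
X = [[0, 1], [1, 0]]   # Pauli X (bit flip)
Z = [[1, 0], [0, -1]]  # Pauli Z (phase flip)
I2 = [[1, 0], [0, 1]]

def mul(A, B):
    """2x2 matrix product."""
    return [[sum(A[r][k] * B[k][c] for k in range(2)) for c in range(2)]
            for r in range(2)]

def dagger(U):
    """Conjugate transpose of a 2x2 matrix."""
    return [[complex(U[c][r]).conjugate() for c in range(2)] for r in range(2)]

def pauli(a, b):
    """X^a Z^b, the unitary selected by the two classical key bits (a, b)."""
    U = I2
    if a: U = mul(U, X)
    if b: U = mul(U, Z)
    return U

def encrypt_state(rho, a, b):
    """Encrypt density matrix rho: rho -> U rho U^dagger with U = X^a Z^b."""
    U = pauli(a, b)
    return mul(mul(U, rho), dagger(U))

def decrypt_state(rho_enc, a, b):
    U = pauli(a, b)
    return mul(mul(dagger(U), rho_enc), U)
```

Averaging the ciphertext over all four equally likely keys yields the maximally mixed state I/2 for every input, which is exactly the "leaks no information" property; approximate schemes relax this to being merely close to I/2, in exchange for fewer key bits.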
Sound 3-query PCPPs are long
, 2008
"... We initiate the study of the tradeoff between the length of a probabilistically checkable proof of proximity (PCPP) and the maximal soundness that can be guaranteed by a 3query verifier with oracle access to the proof. Our main observation is that a verifier limited to querying a short proof cannot ..."
Abstract

Cited by 10 (3 self)
We initiate the study of the tradeoff between the length of a probabilistically checkable proof of proximity (PCPP) and the maximal soundness that can be guaranteed by a 3-query verifier with oracle access to the proof. Our main observation is that a verifier limited to querying a short proof cannot obtain the same soundness as one querying a long proof. Moreover, we quantify the soundness deficiency as a function of the proof length and show that any verifier obtaining “best possible” soundness must query an exponentially long proof. In terms of techniques, we focus on the special class of inspective verifiers that read at most 2 proof bits per invocation. For such verifiers we prove exponential length-soundness tradeoffs that are later used to imply our main results for the case of general (i.e., not necessarily inspective) verifiers. To prove the exponential tradeoff for inspective verifiers we show a connection between PCPP proof length and property-testing query complexity that may be of independent interest. The connection is that any linear property that can be verified with proofs of length ℓ by linear inspective verifiers must be testable with query complexity ≈ log ℓ.
Locally testable cyclic codes
 IEEE Transactions on Information Theory
, 2003
"... Cyclic linear codes of block length n over a finite field Fq are linear subspaces of F n q that are invariant under a cyclic shift of their coordinates. A family of codes is good if all the codes in the family have constant rate and constant normalized distance (distance divided by block length). It ..."
Abstract

Cited by 7 (0 self)
Cyclic linear codes of block length n over a finite field F_q are linear subspaces of F_q^n that are invariant under a cyclic shift of their coordinates. A family of codes is good if all the codes in the family have constant rate and constant normalized distance (distance divided by block length). It is a long-standing open problem whether there exists a good family of cyclic linear codes (cf. [21, p. 270]). A code C is r-testable if there exists a randomized algorithm which, given a word x ∈ F_q^n, adaptively selects r positions, checks the entries of x in the selected positions, and makes a decision (accept or reject x) based on the positions selected and the values found, such that (i) if x ∈ C then x is surely accepted; (ii) if dist(x, C) ≥ ɛn then x is rejected with high probability. (“dist” refers to Hamming distance.) A family of codes is locally testable if all members of the family are r-testable for some constant r. This concept arose from holographic proofs/PCPs. Goldreich and Sudan [17] asked whether there exist good, locally testable families of codes. In this paper we address the intersection of these two questions.
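A standard concrete example of an r-testable code (with r = 3) is the code of linear Boolean functions together with the Blum-Luby-Rubinfeld linearity test: each trial queries three positions x, y, x⊕y and checks one linearity constraint. A toy sketch over F_2 (an illustration of the definition, not the cyclic codes studied in this paper):

```python
import random

def linearity_test(table, k, trials=None, rng=random):
    """Toy 3-query tester for the code of linear functions f(x) = <m, x> mod 2.

    table is the full truth table of f on {0,1}^k.  With trials=None every
    pair (x, y) is checked exhaustively (useful for testing); the actual
    local tester uses a constant number of random trials, reading only
    3 positions per trial.
    """
    n = 2 ** k
    if trials is None:
        pairs = ((x, y) for x in range(n) for y in range(n))
    else:
        pairs = ((rng.randrange(n), rng.randrange(n)) for _ in range(trials))
    for x, y in pairs:
        if table[x] ^ table[y] != table[x ^ y]:
            return False  # reject: a linearity constraint is violated
    return True  # accept; words far from the code are rejected w.h.p.
```

Members of the code pass every trial, while a word at relative distance ɛ from the code fails each random trial with probability Ω(ɛ), matching clauses (i) and (ii) of the definition above.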