Results 1–7 of 7
Improving the space-bounded version of Muchnik’s conditional complexity theorem via “naive” derandomization, 2010, http://arxiv.org/abs/1009.5108 (online version of this paper)
Abstract

Cited by 7 (1 self)
Abstract. Many theorems about Kolmogorov complexity rely on the existence of combinatorial objects with specific properties. Usually the probabilistic method gives such objects with better parameters than explicit constructions do. But the probabilistic method does not give “effective” variants of such theorems, i.e. variants for resource-bounded Kolmogorov complexity. We show that a “naive derandomization” approach of replacing these objects by the output of the Nisan-Wigderson pseudorandom generator may give polynomial-space variants of such theorems. Specifically, we improve the preceding polynomial-space analogue of Muchnik’s conditional complexity theorem: for all a and b there exists a program p of least possible length that transforms a to b and is simple conditional on b. Here all programs work in polynomial space and all complexities are measured with logarithmic accuracy instead of the polylogarithmic one in the previous work.
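The “naive derandomization” approach described above plugs the output of the Nisan-Wigderson generator in place of a randomly chosen object. As a purely illustrative sketch (the parity predicate and the tiny design below are toy placeholders; the real construction requires a provably hard function and much larger design parameters), the NW generator computes one output bit per design set by applying the predicate to the seed restricted to that set:

```python
# Toy Nisan-Wigderson-style generator (illustration only; the predicate
# and design are hypothetical placeholders, not the real ingredients).
from itertools import combinations

def nw_generator(seed_bits, design, f):
    """One output bit per design set: output_i = f(seed restricted to S_i)."""
    return [f(tuple(seed_bits[j] for j in sorted(s))) for s in design]

# Placeholder "hard" predicate: parity (NOT actually hard; toy choice).
parity = lambda bits: sum(bits) % 2

# A tiny design over seed positions {0..5}: 3-element sets with
# pairwise intersections of size at most 1.
design = [{0, 1, 2}, {0, 3, 4}, {1, 3, 5}, {2, 4, 5}]
assert all(len(a & b) <= 1 for a, b in combinations(design, 2))

seed = [1, 0, 1, 1, 0, 0]
output = nw_generator(seed, design, parity)
print(output)  # 4 output bits stretched from a 6-bit seed
```

Because the sets overlap in at most one position, the output bits are “almost independent” functions of the seed, which is what the hardness-based analysis exploits.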
Variations on Muchnik’s Conditional Complexity Theorem ⋆
Abstract

Cited by 6 (2 self)
Abstract. Muchnik’s theorem about simple conditional descriptions states that for all strings a and b there exists a short program p transforming a to b that has the least possible length and is simple conditional on b. In this paper we present two new proofs of this theorem. The first one is based on the on-line matching algorithm for bipartite graphs. The second one, based on extractors, can be generalized to prove a version of Muchnik’s theorem for space-bounded Kolmogorov complexity.

1 Muchnik’s Theorem

An. Muchnik [8] has proven the following theorem:

Theorem 1. Let a and b be two binary strings, C(a) < n and C(a|b) < k. Then there exists a string p such that
• C(a|p, b) = O(log n);
• C(p) ≤ k + O(log n);
• C(p|a) = O(log n).

Here C(u) stands for the Kolmogorov complexity of string u (the length of a shortest program generating u); the conditional complexity of u given v (the length of a shortest program that translates v to u) is denoted by C(u|v), see [5]. The constants hidden in O(log n) do not depend on n, k, a, b, p. Informally, this theorem says that there exists a program p that transforms b to a, has the minimal possible complexity C(a|b) (up to a logarithmic term) and, moreover, can be easily obtained from a. (The last requirement is crucial; otherwise the statement becomes a trivial reformulation of the definition of conditional Kolmogorov complexity.) This theorem is an algorithmic counterpart of the Slepian–Wolf theorem [12] in multi-source information theory. Assume that somebody (S) knows b and wants …

⋆ Supported by ANR Sycomore, NAFIT ANR-08-EMER-008-01 and RFBR 09-01-00709-a grants.
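The quantities C(u) and C(u|v) in Theorem 1 are uncomputable, but a standard heuristic (used, for example, in normalized compression distance) upper-bounds them with a real compressor. The helper below is such an illustrative proxy, not the quantity in the theorem; it at least exhibits the intuition that describing a given itself is nearly free:

```python
# Heuristic proxy for conditional complexity C(u|v) via zlib (an upper
# bound in spirit only; the true quantity is uncomputable).
import hashlib
import zlib

def cond_complexity_proxy(u: bytes, v: bytes) -> int:
    """Extra compressed bytes needed for u once v is available as context."""
    return len(zlib.compress(v + u, 9)) - len(zlib.compress(v, 9))

# A pseudorandom, hard-to-compress test string.
x = b"".join(hashlib.sha256(bytes([i])).digest() for i in range(8))

# With the string itself as the condition, the description cost collapses,
# mirroring C(a|a) = O(1) versus C(a) for incompressible a.
assert cond_complexity_proxy(x, x) < cond_complexity_proxy(x, b"")
```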
Reconstructive dispersers and hitting set generators
 In APPROX-RANDOM
, 2005
Abstract

Cited by 6 (3 self)
Abstract. We give a generic construction of an optimal hitting set generator (HSG) from any good “reconstructive” disperser. Past constructions of optimal HSGs have been based on such disperser constructions, but have had to modify the construction in a complicated way to meet the stringent efficiency requirements of HSGs. The construction in this paper uses existing disperser constructions with the “easiest” parameter setting in a black-box fashion to give new constructions of optimal HSGs without any additional complications. Our results show that a straightforward composition of the Nisan-Wigderson pseudorandom generator that is similar to the composition in works by Impagliazzo, Shaltiel and Wigderson in fact yields optimal HSGs (in contrast to the “near-optimal” HSGs constructed in those works). Our results also give optimal HSGs that do not use any form of hardness amplification or implicit list-decoding – like Trevisan’s extractor, the only ingredients are combinatorial designs and any good list-decodable error-correcting code.
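The combinatorial designs mentioned above as an ingredient are families of equal-size sets with small pairwise intersections. At toy parameters such a family can be found by brute-force greedy search; this is only a sketch (hypothetical parameters, nowhere near the regime optimal HSGs require):

```python
# Greedy brute-force construction of a small combinatorial design
# (toy parameters only; illustrative, not the paper's construction).
from itertools import combinations

def greedy_design(d, l, a, m):
    """Pick up to m l-subsets of {0..d-1} with pairwise intersections <= a."""
    chosen = []
    for cand in combinations(range(d), l):
        s = set(cand)
        if all(len(s & t) <= a for t in chosen):
            chosen.append(s)
            if len(chosen) == m:
                break
    return chosen

design = greedy_design(d=10, l=3, a=1, m=8)
# Invariant: every pair of chosen sets shares at most one element.
assert all(len(s & t) <= 1 for s, t in combinations(design, 2))
print(len(design))
```

Even this naive search finds the requested 8 sets over a 10-element ground set, illustrating that designs are plentiful at modest parameters.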
Compression of samplable sources
 In IEEE Conference on Computational Complexity
, 2004
Abstract

Cited by 5 (0 self)
Abstract. We study the compression of polynomially samplable sources. In particular, we give efficient prefix-free compression and decompression algorithms for three classes of such sources (whose support is a subset of {0,1}^n).
1. We show how to compress sources X samplable by log-space machines to expected length H(X) + O(1).
Our next results concern flat sources whose support is in P.
2. If H(X) ≤ k = n − O(log n), we show how to compress to length k + polylog(n − k).
3. If the support of X is the witness set for a self-reducible NP relation, then we show how to compress to expected length H(X) + 5.
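For a flat source X uniform on a support S that is decidable in polynomial time, the information-theoretic optimum is simply indexing into S, using ⌈log₂ |S|⌉ = H(X) bits. A minimal sketch with a hypothetical even-parity support (the support choice and function names are illustrative, not from the paper):

```python
# Indexing compression for a toy flat source: all n-bit strings with an
# even number of 1s (membership is clearly polynomial-time decidable).
import math

n = 6
support = sorted(
    format(i, f"0{n}b") for i in range(2 ** n)
    if format(i, f"0{n}b").count("1") % 2 == 0
)

def compress(x: str) -> str:
    """Encode x as its index in the sorted support: ceil(log2 |S|) bits."""
    k = math.ceil(math.log2(len(support)))
    return format(support.index(x), f"0{k}b")

def decompress(code: str) -> str:
    return support[int(code, 2)]

# Round-trip on the whole support; |S| = 32, so every codeword is 5 bits.
assert all(decompress(compress(x)) == x for x in support)
```

The hard part the paper addresses is achieving this without being able to enumerate or index S efficiently; the sketch above sidesteps that by brute-force enumeration.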
Improving the Space-Bounded Version of Muchnik’s Conditional Complexity Theorem via “Naive” Derandomization
, 2010
Abstract
Abstract. Many theorems about Kolmogorov complexity rely on the existence of combinatorial objects with specific properties. Usually the probabilistic method gives such objects with better parameters than explicit constructions do. But the probabilistic method does not give “effective” variants of such theorems, i.e. variants for resource-bounded Kolmogorov complexity. We show that a “naive derandomization” approach of replacing these objects by the output of the Nisan-Wigderson pseudorandom generator may give polynomial-space variants of such theorems. Specifically, we improve the preceding polynomial-space analogue of Muchnik’s conditional complexity theorem: for all a and b there exists a program p of least possible length that transforms a to b and is simple conditional on b. Here all programs work in polynomial space and all complexities are measured with logarithmic accuracy instead of the polylogarithmic one in the previous work.
On the optimal compression of sets in PSPACE
Abstract
Abstract. We show that if DTIME[2^O(n)] is not included in DSPACE[2^o(n)], then, for every set B in PSPACE, all strings x in B of length n can be represented by a string compressed(x) of length at most log |B^{=n}| + O(log n), such that a polynomial-time algorithm, given compressed(x), can distinguish x from all the other strings in B^{=n}. Modulo the O(log n) additive term, this achieves the information-theoretic optimum for string compression.
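Note that compressed(x) only has to let a polynomial-time algorithm distinguish x from the other members of B^{=n}, not reconstruct x outright. A toy stand-in for this distinguishing idea is a modular fingerprint (p, x mod p) chosen so that x is the unique member matching it (illustration only; the paper’s actual construction relies on the hardness assumption and pseudorandom-generator machinery, not small primes):

```python
# Toy "distinguishing description": fingerprint x inside a small set by
# the least prime modulus separating it from all other members.
def primes():
    """Yield primes by trial division (fine at toy scale)."""
    p = 2
    while True:
        if all(p % q for q in range(2, int(p ** 0.5) + 1)):
            yield p
        p += 1

def fingerprint(x: int, others: set[int]) -> tuple[int, int]:
    """Smallest prime p with x mod p != y mod p for every other member."""
    for p in primes():
        if all(x % p != y % p for y in others):
            return p, x % p

B = {0b101100, 0b110010, 0b011011, 0b111000}  # toy stand-in for B^{=n}, n = 6
for x in B:
    p, r = fingerprint(x, B - {x})
    # The fingerprint identifies x uniquely within B.
    assert [y for y in B if y % p == r] == [x]
```

The fingerprint is short whenever the set is small, echoing the log |B^{=n}| target length, but making this work for every PSPACE set with only an O(log n) overhead is exactly what requires the paper’s derandomization hypothesis.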