Compression of Graphical Structures: Fundamental Limits, Algorithms, and Experiments
, 2009

Cited by 11 (5 self)
Information theory traditionally deals with “conventional data,” be it textual, image, or video data. However, databases of various sorts have come into existence in recent years for storing “unconventional data,” including biological data, social data, web data, topographical maps, and medical data. In compressing such data, one must consider two types of information: the information conveyed by the structure itself, and the information conveyed by the data labels implanted in the structure. In this paper, we attempt to address the former problem by studying the information content of graphical structures (i.e., unlabeled graphs). As a first step, we consider the Erdős–Rényi graphs G(n, p) over n vertices, in which edges are added randomly with probability p. We prove that the structural entropy of G(n, p) is …
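The abstract's formula for the structural entropy is cut off above. As a rough illustration of the quantities involved, the sketch below computes the entropy of the *labeled* graph, C(n, 2)·h(p), and then discounts log₂ n! bits for the vertex labelings; that correction term is how the unlabeled result is commonly stated up to lower-order terms, and is our assumption here, not taken from this snippet.

```python
from math import comb, lgamma, log, log2

def binary_entropy(p):
    """Entropy in bits of a single Bernoulli(p) edge indicator."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

def labeled_entropy(n, p):
    """Entropy of G(n, p) as a labeled graph: one independent
    Bernoulli(p) bit for each of the C(n, 2) vertex pairs."""
    return comb(n, 2) * binary_entropy(p)

def structural_entropy_estimate(n, p):
    """Hedged first-order estimate of the unlabeled (structural) entropy:
    discount log2(n!) bits for the n! vertex labelings, ignoring
    symmetric graphs and lower-order terms."""
    return labeled_entropy(n, p) - lgamma(n + 1) / log(2)
```

For n = 100 and p = 0.5 the labeled entropy is 4950 bits, of which roughly 525 bits are attributable to the labels under this approximation.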
Minimum Expected Length of Fixed-to-Variable Lossless Compression of Memoryless Sources

Cited by 11 (5 self)
Abstract—Conventional wisdom states that the minimum expected length for fixed-to-variable length encoding of an n-block memoryless source with entropy H grows as nH + O(1). However, this performance is obtained under the constraint that the code assigned to the whole n-block is a prefix code. Dropping this unnecessary constraint, we show that the minimum expected length grows as nH − (1/2) log n + O(1), unless the source is equiprobable.
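For intuition, the optimal one-to-one (non-prefix) code simply lists the 2ⁿ source blocks in decreasing order of probability and assigns the k-th block a codeword of ⌊log₂ k⌋ bits (the empty string for k = 1). The brute-force sketch below, for a small Bernoulli block source, lets one compare the resulting expected length against nH; the enumeration and parameter values are illustrative, not from the paper.

```python
from itertools import product
from math import log2

def min_expected_length(n, p):
    """Expected length of the optimal one-to-one code for n-blocks of a
    Bernoulli(p) source: sort block probabilities in decreasing order;
    the rank-k block gets a codeword of floor(log2 k) bits."""
    probs = sorted(
        (p ** sum(b) * (1 - p) ** (n - sum(b)) for b in product((0, 1), repeat=n)),
        reverse=True,
    )
    # k.bit_length() - 1 == floor(log2 k) for k >= 1, exactly in integers
    return sum(q * (k.bit_length() - 1) for k, q in enumerate(probs, start=1))

def block_entropy(n, p):
    """nH for the n-block Bernoulli(p) source, in bits."""
    return n * (-p * log2(p) - (1 - p) * log2(1 - p))
```

For p = 0.3 and n = 12 the expected length falls visibly below nH, consistent with the nH − (1/2) log n behavior stated above.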
Optimal information storage: Nonsequential sources and neural channels
, 2006

Cited by 9 (6 self)
Information storage and retrieval systems are communication systems from the present to the future and fall naturally into the framework of information theory. The goal of information storage is to preserve as much signal fidelity as possible under resource constraints. The information storage theorem delineates average fidelity and average resource values that are achievable and those that are not. Moreover, observable properties of optimal information storage systems and the robustness of optimal systems …
Analytic Variations on Redundancy Rates of Renewal Processes
 IEEE Trans. Information Theory
, 2002

Cited by 8 (5 self)
Csiszár and Shields have recently proved that the minimax redundancy for a class of (stationary) renewal processes is Θ(√n), where n is the block length. This interesting result provides a first nontrivial bound on redundancy for a nonparametric family of processes. The present paper gives a precise estimate of the redundancy rate for such (nonstationary) renewal sources, namely (2/log 2)·√((π²/6 − 1)·n) + O(log n). This asymptotic expansion is derived by complex-analytic methods that include generating function representations, Mellin transforms, singularity analysis, and saddle point estimates. This work places itself within the framework of analytic information theory.
Two-phase cardinality estimation protocols for sensor networks with provable precision
 in IEEE Wireless Communications and Networking Conference
, 2012

Cited by 6 (0 self)
Abstract—Efficient cardinality estimation is a common requirement for many wireless sensor network (WSN) applications. The task must be accomplished at extremely low overhead due to severe sensor resource limitations. This poses an interesting challenge for large-scale WSNs. In this paper we present a two-phase probabilistic algorithm, based on order statistics and a Bernoulli scheme, which effectively estimates the cardinality of WSNs. We thoroughly examine the properties of the estimators used in each phase, as well as the precision of the whole procedure. The algorithm discussed in this paper is a modification of a recently published idea; the modification enables us to obtain a provable precision.
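The paper's two-phase protocol is not reproduced in this snippet, but the underlying order-statistics idea can be sketched as follows: if each of n nodes draws an independent Exp(1) variate and only the minimum is aggregated, that minimum is Exp(n)-distributed, and k independent rounds yield the unbiased estimator (k − 1)/Σ minima. All function names and parameters below are illustrative, not the paper's.

```python
import random

def estimate_cardinality(n, rounds=400, seed=1):
    """Order-statistics cardinality sketch (illustration only).

    Each round, every one of the n nodes draws an Exp(1) variate and the
    network aggregates only the minimum (e.g., along a spanning tree).
    The minimum of n Exp(1) variates is Exp(n)-distributed, so with k
    rounds, (k - 1) / sum(minima) is an unbiased estimate of n, with
    relative error on the order of 1/sqrt(k - 2).
    """
    rng = random.Random(seed)
    minima = [min(rng.expovariate(1.0) for _ in range(n)) for _ in range(rounds)]
    return (rounds - 1) / sum(minima)
```

With 400 rounds the relative error is around 5%, which is the kind of tunable precision/overhead trade-off the abstract refers to.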
Discrete entropies of orthogonal polynomials
, 2008

Cited by 4 (1 self)
Let p_n, n ∈ N, be the nth orthonormal polynomial on R, whose zeros are λ_j^(n), j = 1, …, n. Then for each j = 1, …, n, with the vector Ψ_j defined componentwise by Ψ_ij² = p_{i−1}²(λ_j^(n)), …
Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
, 2009

Cited by 4 (1 self)
Abstract—Sufficient conditions are developed under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. Recently, one of the authors [O. Johnson, Stoch. Proc. Appl., 2007] used a semigroup approach to show that the Poisson has maximal entropy among all ultra-log-concave distributions with fixed mean. We show, via a nontrivial extension of this semigroup approach, that the natural analog of the Poisson maximum entropy property remains valid if the compound Poisson distributions under consideration are log-concave, but that it fails in general. A parallel maximum entropy result is established for the family of compound binomial measures. Sufficient conditions for compound distributions to be log-concave are discussed, and applications to combinatorics are examined; new bounds are derived on the entropy of the cardinality of a random independent set in a claw-free graph, and a connection is drawn to Mason's conjecture for matroids. The present results are primarily motivated by the desire to provide an information-theoretic foundation for compound Poisson approximation and associated limit theorems, analogous to the corresponding developments for the central limit theorem and for Poisson approximation. Our results also demonstrate new links between some probabilistic methods and the combinatorial notions of log-concavity and ultra-log-concavity, and they add to the growing body of work exploring the applications of maximum entropy characterizations to problems in discrete mathematics.
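The two notions driving the abstract's dichotomy are easy to check numerically for a finitely supported pmf: log-concavity asks p_k² ≥ p_{k−1}·p_{k+1} for all interior k, while ultra-log-concavity strengthens this to k·p_k² ≥ (k + 1)·p_{k−1}·p_{k+1}, i.e., log-concavity of the ratio of p to a Poisson pmf. A small sketch (helper names are ours, not from the paper):

```python
from math import comb

def is_log_concave(p, tol=1e-12):
    """Check p_k^2 >= p_{k-1} * p_{k+1} for all interior k."""
    return all(p[k] ** 2 >= p[k - 1] * p[k + 1] - tol
               for k in range(1, len(p) - 1))

def is_ultra_log_concave(p, tol=1e-12):
    """Check k * p_k^2 >= (k + 1) * p_{k-1} * p_{k+1}, i.e.,
    log-concavity of p relative to a Poisson distribution."""
    return all(k * p[k] ** 2 >= (k + 1) * p[k - 1] * p[k + 1] - tol
               for k in range(1, len(p) - 1))

def binomial_pmf(n, q):
    return [comb(n, k) * q ** k * (1 - q) ** (n - k) for k in range(n + 1)]

def geometric_pmf(q, support=30):
    # Truncated geometric pmf on {0, ..., support - 1}
    return [(1 - q) * q ** k for k in range(support)]
```

The binomial is a classical example of an ultra-log-concave distribution, while the geometric is log-concave (its consecutive ratios are constant) but not ultra-log-concave, illustrating that the two classes genuinely differ.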
One-to-One Code and Its Anti-Redundancy
 IEEE Trans. Information Theory
, 2008

Cited by 3 (2 self)
One-to-one codes are “one-shot” codes that assign a distinct codeword to each source symbol and are not necessarily prefix codes (or, more generally, uniquely decodable). Interestingly, as Wyner proved in 1972, for such codes the average code length can be smaller than the source entropy. By how much? We call this difference the anti-redundancy. Various authors over the years have shown that the anti-redundancy can be as big as minus the logarithm of the source entropy. However, to the best of our knowledge, precise estimates do not exist. In this note, we consider a block code of length n generated by a binary memoryless source, and prove that the average anti-redundancy is −(1/2) log₂ n + C + F(n) + o(1), where C is a constant, and either F(n) = 0 if log₂((1 − p)/p) is irrational (where p is the probability of generating a “0”), or otherwise F(n) is a fluctuating function as the code length increases. This relatively simple finding requires a combination of analytic tools such as the precise evaluation of Bernoulli sums, the saddle point method, and the theory of distribution of sequences modulo 1.