Results 1 - 10 of 1,898,967
Information Theory and Statistics
1968
Cited by 1761 (2 self)
Abstract: Entropy and relative entropy are proposed as features extracted from symbol sequences. First, a proper Iterated Function System is driven by the sequence, producing a fractal-like representation (CSR) at low computational cost. Two entropic measures are then applied to the histogram of the CSR and theoretically justified. Examples are included.
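The pipeline this abstract describes (sequence drives an IFS, the orbit is binned into a histogram, and entropy and relative entropy are read off the histogram) can be sketched compactly. A minimal sketch in Python, assuming the IFS is the familiar chaos-game construction on a four-symbol alphabet; the corner assignment, histogram resolution k, and the toy sequences are illustrative assumptions, not the paper's exact CSR construction:

```python
import numpy as np

# Corners of the unit square for the four symbols (a common chaos-game setup).
CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_histogram(seq, k=4):
    """Drive the IFS x <- (x + corner)/2 with the sequence and bin the
    visited points into a normalized 2^k x 2^k histogram."""
    n = 2 ** k
    hist = np.zeros((n, n))
    x, y = 0.5, 0.5
    for s in seq:
        cx, cy = CORNERS[s]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        hist[min(int(y * n), n - 1), min(int(x * n), n - 1)] += 1
    return hist / hist.sum()

def entropy(p, eps=1e-12):
    p = p[p > eps]
    return -np.sum(p * np.log2(p))

def relative_entropy(p, q, eps=1e-12):
    mask = p > eps
    return np.sum(p[mask] * np.log2(p[mask] / np.maximum(q[mask], eps)))

seq = "ACGTACGGTCAG" * 50  # toy symbol sequence
ref = "ACGT" * 150         # toy reference sequence
p, q = cgr_histogram(seq).ravel(), cgr_histogram(ref).ravel()
print(f"H(p)    = {entropy(p):.3f} bits")
print(f"D(p||q) = {relative_entropy(p, q):.3f} bits")
```

Sequences with more statistical structure concentrate mass on fewer histogram cells, lowering the entropy; the relative entropy then measures how far the sequence's cell distribution sits from a reference distribution.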
Algorithmic information theory
IBM Journal of Research and Development, 1977
Cited by 399 (20 self)
Abstract: This paper reviews algorithmic information theory, which is an attempt to apply information-theoretic and probabilistic ideas to recursive function theory. Typical concerns in this approach are, for example, the number of bits of information required to specify an algorithm, or the probability that ...
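The "number of bits required to specify an algorithm" is uncomputable in general, but the compressed length of a string is a computable upper bound on the bits needed to specify it (up to an additive constant for the decompressor). A minimal illustration in Python, assuming zlib as a stand-in description language; this is a heuristic proxy, not the theory's universal computer:

```python
import os
import zlib

def description_bits(data: bytes) -> int:
    """Compressed size in bits: a computable upper bound on the
    information needed to specify `data`, relative to the zlib
    decompressor."""
    return 8 * len(zlib.compress(data, level=9))

regular = b"01" * 500       # highly patterned: a short description suffices
random_ = os.urandom(1000)  # incompressible with overwhelming probability
print(description_bits(regular))  # far fewer than the raw 8000 bits
print(description_bits(random_))  # close to (or above) 8000 bits
```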
A Theory of Program Size Formally Identical to Information Theory
1975
Cited by 402 (17 self)
Abstract: A new definition of program-size complexity is made. H(A,B/C,D) is defined to be the size in bits of the shortest self-delimiting program for calculating strings A and B if one is given a minimal-size self-delimiting program for calculating strings C and D. This differs from previous definitions: (1) ... concept of information theory. For example, H(A,B) = H(A) + H(B/A) + O(1). Also, if a program of length k is assigned measure 2^{-k}, then H(A) = -log_2(the probability that the standard universal computer will calculate A) + O(1). Key words and phrases: computational complexity, entropy ...
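The formal identity claimed in the title can be made explicit by setting the program-size identities beside their Shannon counterparts. A brief restatement in LaTeX, writing U for the standard universal computer (the symbol U is our notation; the identities are the ones quoted in the abstract):

```latex
\begin{align*}
H(A,B) &= H(A) + H(B/A) + O(1)
  && \text{chain rule; compare } H(X,Y) = H(X) + H(Y \mid X) \\
H(A)   &= -\log_2 \Pr[\,U \text{ computes } A\,] + O(1)
  && \text{each length-$k$ program carries measure } 2^{-k}
\end{align*}
```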
A Network Information Theory for Wireless Communication: Scaling Laws and Optimal Operation
IEEE Transactions on Information Theory, 2002
Cited by 369 (18 self)
Abstract: How much information can be carried over a wireless network with a multiplicity of nodes? What are the optimal strategies for information transmission and cooperation among the nodes? We obtain sharp information-theoretic scaling laws under some conditions.
Some informational aspects of visual perception
Psychol. Rev., 1954
Cited by 628 (2 self)
Abstract: The ideas of information theory are at present stimulating many different areas of psychological inquiry. In providing techniques for quantifying situations which have hitherto been difficult or impossible to quantify, they suggest new and more precise ways of conceptualizing these situations (see M ...
Graph Theory
Mathematisches Forschungsinstitut Oberwolfach, Report No. 16/2007, 2007
Cited by 1182 (5 self)
Abstract: This week broadly targeted both finite and infinite graph theory, as well as matroids, including their interaction with other areas of pure mathematics. The talks were complemented by informal workshops focussing on specific problems or particularly active areas.
Multimodality Image Registration by Maximization of Mutual Information
IEEE Transactions on Medical Imaging, 1997
Cited by 777 (9 self)
Abstract: A new approach to the problem of multimodality medical image registration is proposed, using a basic concept from information theory, mutual information (MI), or relative entropy, as a new matching criterion. The method presented in this paper applies MI to measure the statistical dependence or in ...
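As a concrete illustration of MI as a matching criterion, here is a minimal sketch, assuming the images arrive as equally-shaped integer NumPy arrays and estimating MI from a joint intensity histogram; the bin count and the toy images are arbitrary choices for illustration, not the paper's protocol:

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Estimate MI between two equally-shaped images from their joint
    intensity histogram: I(A;B) = H(A) + H(B) - H(A,B)."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    def h(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))
    return h(px) + h(py) - h(pxy.ravel())

rng = np.random.default_rng(0)
a = rng.integers(0, 256, (64, 64))
b = rng.integers(0, 256, (64, 64))
print(mutual_information(a, a))  # high: identical images are fully dependent
print(mutual_information(a, b))  # near zero: independent intensities
```

Registration then searches over spatial transformations of one image for the pose that maximizes this criterion.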