Results 1 - 10 of 3,273
On the entropy rate of pattern processes
Proceedings of the 2005 Data Compression Conference, Snowbird, 2005
"... We study the entropy rate of pattern sequences of stochastic processes, and its relationship to the entropy rate of the original process. We give a complete characterization of this relationship for i.i.d. processes over arbitrary alphabets, stationary ergodic processes over discrete alphabets, and ..."
Cited by 11 (0 self)
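As background for the result above: for an i.i.d. process the entropy rate reduces to the single-letter entropy H(X). A minimal sketch in Python (the distribution `p` below is an illustrative assumption, not taken from the paper):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a probability vector p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# For an i.i.d. process, the entropy rate equals H(X) bits per symbol.
p = [0.5, 0.25, 0.25]  # illustrative distribution (an assumption)
rate = entropy(p)      # 1.5 bits per symbol
```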
Entropy rate superpixel segmentation
2011
"... We propose a new objective function for superpixel segmentation. This objective function consists of two components: entropy rate of a random walk on a graph and a balancing term. The entropy rate favors formation of compact and homogeneous clusters, while the balancing function encourages clusters ..."
Cited by 33 (1 self)
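The entropy rate of a random walk mentioned above has a closed form on an undirected, unweighted graph: H = -Σ_i μ_i Σ_j P_ij log P_ij, where μ_i = deg(i)/2|E| is the stationary distribution and P_ij = 1/deg(i). A sketch under those assumptions (the adjacency-list representation is illustrative, not the paper's code):

```python
import math

def random_walk_entropy_rate(adj):
    """Entropy rate (bits/step) of a uniform random walk on an
    undirected, unweighted graph given as {node: set_of_neighbors}.
    Assumes every node has degree >= 1 and the walk is stationary
    with mu_i = deg(i) / (2|E|)."""
    two_m = sum(len(nbrs) for nbrs in adj.values())  # = 2|E|
    rate = 0.0
    for node, nbrs in adj.items():
        d = len(nbrs)
        mu = d / two_m
        # P_ij = 1/d is uniform, so -sum_j P_ij log2 P_ij = log2(d)
        rate += mu * math.log2(d)
    return rate

# Triangle graph: each step is uniform over 2 neighbors -> 1 bit/step.
triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
```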
On analytic properties of entropy rate
2009
"... Entropy rate is a real valued functional on the space of discrete random sources for which it exists. However, it lacks existence proofs and/or closed formulas even for classes of random sources which have intuitive parameterizations. A good way to overcome this problem is to examine its analytic p ..."
Cited by 6 (3 self)
Dimension, Entropy Rates, and Compression
"... This paper develops new relationships between resource-bounded dimension, entropy rates, and compression. New tools for calculating dimensions are given and used to improve previous results about circuit-size complexity classes. Approximate counting of SpanP functions is used to prove that the NP-entropy ..."
Entropy Rate of Stochastic Processes
2015
"... The entropy rate of independent and identically distributed events can on average be encoded by H(X) bits per source symbol. However, in reality, series of events (or processes) are often randomly distributed and there can be arbitrary dependence between each event. Such processes with arbitrary dep ..."
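For processes with dependence between events, as discussed above, the simplest non-i.i.d. case is a stationary Markov chain, whose entropy rate is H = -Σ_i μ_i Σ_j P_ij log P_ij with μ the stationary distribution. A sketch assuming an ergodic chain (the transition matrix and power-iteration approach are illustrative assumptions):

```python
import math

def markov_entropy_rate(P, iters=1000):
    """Entropy rate (bits/symbol) of a stationary ergodic Markov chain
    with transition matrix P (list of row lists). The stationary
    distribution mu is approximated by power iteration."""
    n = len(P)
    mu = [1.0 / n] * n
    for _ in range(iters):
        mu = [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]
    return -sum(mu[i] * P[i][j] * math.log2(P[i][j])
                for i in range(n) for j in range(n) if P[i][j] > 0)

# Symmetric binary chain that flips state with probability 0.1:
# rate equals the binary entropy H(0.1) ~ 0.469 bits/symbol.
P = [[0.9, 0.1], [0.1, 0.9]]
```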
Entropy rates and finite-state dimension
Theoretical Computer Science, 2005
"... The effective fractal dimensions at the polynomial-space level and above can all be equivalently defined as the C-entropy rate where C is the class of languages corresponding to the level of effectivization. For example, pspace-dimension is equivalent to the PSPACE-entropy rate. At lower levels of c ..."
Cited by 14 (1 self)
Entropy Rate Constancy in Text
In Proceedings of ACL–2002, 2002
"... We present a constancy rate principle governing language generation. We show that this principle implies that local measures of entropy (ignoring context) should increase with the sentence number. We demonstrate that this is indeed the case by measuring entropy in three different ways. We also show ..."
Cited by 38 (2 self)
On the Entropy Rate of Word-Valued Sources
"... A word-valued source Y is any discrete finite-alphabet random process that is created by encoding a discrete random process X with a symbol-to-word function f. The first result of this paper solves an open problem by proving an existence theorem for the entropy rate of word-valued sources ..."
Cited by 2 (0 self)
Entropy Rate of Thermal Diffusion
2013
"... The thermal diffusion of a free particle is a random process and generates entropy at a rate equal to twice the particle's temperature, 2k_BT (in natural units of information per second). The rate is calculated using a Gaussian process with a variance which is a combination of quantum and classical diffusion ..."