Results 1 - 10 of 3,273

On the entropy rate of pattern processes

by George M. Gemelos, Tsachy Weissman - Proceedings of the 2005 Data Compression Conference, Snowbird , 2005
"... We study the entropy rate of pattern sequences of stochastic processes, and its relationship to the entropy rate of the original process. We give a complete characterization of this relationship for i.i.d. processes over arbitrary alphabets, stationary ergodic processes over discrete alphabets, and ..."
Abstract - Cited by 11 (0 self)
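The paper above does not include its definitions in the snippet, but in this literature the pattern of a sequence is standardly obtained by replacing each symbol with the order of its first appearance, so that patterns discard symbol identity. A minimal illustrative sketch:

```python
def pattern(seq):
    """Map each symbol of seq to the order in which it first appears.

    Sequences such as "abracadabra" and "xyzxuxvxyzx" share the same
    pattern, since patterns discard the identity of the symbols.
    """
    first_seen = {}  # symbol -> index of first appearance (1-based)
    out = []
    for s in seq:
        if s not in first_seen:
            first_seen[s] = len(first_seen) + 1
        out.append(first_seen[s])
    return out

print(pattern("abracadabra"))  # [1, 2, 3, 1, 4, 1, 5, 1, 2, 3, 1]
```

The pattern process of a stochastic process is then the sequence of these indices, whose entropy rate the paper relates to that of the original process.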

Entropy rate superpixel segmentation

by Ming-yu Liu, Oncel Tuzel, Srikumar Ramalingam , 2011
"... We propose a new objective function for superpixel segmentation. This objective function consists of two components: entropy rate of a random walk on a graph and a balancing term. The entropy rate favors formation of compact and homogeneous clusters, while the balancing function encourages clusters ..."
Abstract - Cited by 33 (1 self)
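The quantity this objective builds on is the entropy rate of a random walk on a graph. The sketch below is not from the paper; it computes the generic quantity for an unweighted undirected graph, where the stationary distribution of the walk is mu_i = deg(i) / (2|E|):

```python
import math

def random_walk_entropy_rate(adj):
    """Entropy rate H = -sum_i mu_i sum_j P_ij log2 P_ij of the random
    walk on an undirected graph given as an adjacency matrix.

    For an unweighted undirected graph the stationary distribution is
    each node's degree divided by the total degree.
    """
    degrees = [sum(row) for row in adj]
    total = sum(degrees)
    h = 0.0
    for i, row in enumerate(adj):
        mu_i = degrees[i] / total
        for w in row:
            if w > 0:
                p = w / degrees[i]       # transition probability P_ij
                h -= mu_i * p * math.log2(p)
    return h

# Triangle graph: the walk is uniform over 2 neighbours at every step,
# so the entropy rate is exactly 1 bit per step.
triangle = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
print(random_walk_entropy_rate(triangle))  # 1.0
```

Maximizing this rate over cluster subgraphs favours compact, homogeneous clusters, which is the role the abstract assigns to the entropy-rate term.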

On analytic properties of entropy rate

by Alexander Schönhuth , 2009
"... Entropy rate is a real valued functional on the space of discrete random sources for which it exists. However, it lacks existence proofs and/or closed formulas even for classes of random sources which have intuitive parameterizations. A good way to overcome this problem is to examine its analytic p ..."
Abstract - Cited by 6 (3 self)

Dimension, Entropy Rates, and Compression

by John M. Hitchcock, N. V. Vinodchandran
"... This paper develops new relationships between resource-bounded dimension, entropy rates, and compression. New tools for calculating dimensions are given and used to improve previous results about circuit-size complexity classes. Approximate counting of SpanP functions is used to prove that the NP-en ..."
Abstract

Entropy Rate of Stochastic Processes

by Timo Mulder, Jorn Peters , 2015
"... The entropy rate of independent and identically distributed events can on average be encoded by H(X) bits per source symbol. However, in reality, series of events (or processes) are often randomly distributed and there can be arbitrary dependence between each event. Such processes with arbitrary dep ..."
Abstract
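The abstract's starting point is the standard fact that an i.i.d. source can be coded at H(X) bits per symbol; for a stationary first-order Markov chain the entropy rate becomes a stationary-weighted average of the per-state entropies. A self-contained sketch of both quantities (the example chain is illustrative, not from the paper):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def markov_entropy_rate(P, mu):
    """Entropy rate of a stationary first-order Markov chain:
    H = sum_i mu_i * H(P[i]), where mu is the stationary distribution
    and P[i] is row i of the transition matrix.
    """
    return sum(m * entropy(row) for m, row in zip(mu, P))

# i.i.d. fair coin: 1 bit per symbol.
print(entropy([0.5, 0.5]))  # 1.0

# A "sticky" two-state chain that stays put with probability 0.9.
# Its stationary distribution is (0.5, 0.5) by symmetry, so the rate
# is H(0.9, 0.1), strictly less than 1 bit per symbol: dependence
# between events lowers the entropy rate below the i.i.d. value.
P = [[0.9, 0.1], [0.1, 0.9]]
print(markov_entropy_rate(P, [0.5, 0.5]))
```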

Entropy rates and finite-state dimension

by Chris Bourke, John M. Hitchcock, N. V. Vinodchandran - THEORETICAL COMPUTER SCIENCE , 2005
"... The effective fractal dimensions at the polynomial-space level and above can all be equivalently defined as the C-entropy rate where C is the class of languages corresponding to the level of effectivization. For example, pspace-dimension is equivalent to the PSPACE-entropy rate. At lower levels of c ..."
Abstract - Cited by 14 (1 self)

Entropy Rate Constancy in Text

by Dmitriy Genzel, Eugene Charniak - In Proceedings of ACL–2002 , 2002
"... We present a constancy rate principle governing language generation. We show that this principle implies that local measures of entropy (ignoring context) should increase with the sentence number. We demonstrate that this is indeed the case by measuring entropy in three different ways. We also show ..."
Abstract - Cited by 38 (2 self)

On the Entropy Rate of Word-Valued Sources

by R. Timo, K. Blackmore, L. Hanlen
"... Abstract — A word-valued source Y is any discrete finite alphabet random process that is created by encoding a discrete random process X with a symbol-to-word function f. The first result of this paper solves an open problem by proving an existence theorem for the entropy rate of word valued sources ..."
Abstract - Cited by 2 (0 self)
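The construction the abstract describes, a source Y built by pushing a process X through a symbol-to-word function f and concatenating, can be sketched directly. The mapping f below is a hypothetical example, not one from the paper:

```python
def word_valued(x_seq, f):
    """Create the word-valued source Y by mapping each symbol of X
    through a symbol-to-word function f and concatenating the words.

    Because words may have different lengths, the symbol boundaries of
    X are lost in Y, which is what makes the entropy rate of Y
    nontrivial to relate to that of X.
    """
    out = []
    for x in x_seq:
        out.extend(f[x])
    return out

# Hypothetical code: symbol 0 -> word "a", symbol 1 -> word "ab".
f = {0: "a", 1: "ab"}
print("".join(word_valued([0, 1, 1, 0], f)))  # aababa
```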

Entropy Rate Constancy in Text

by Dmitriy Genzel, Eugene Charniak - In Proceedings of ACL–2002 , 2002
"... We present a constancy rate principle governing language generation. We show that this principle implies that local measures of entropy (ignoring context) should increase with the sentence number. We demonstrate that this is indeed the case by measuring entropy in three different ways. We als ..."
Abstract

Entropy Rate of Thermal Diffusion

by John Laurence Haller Jr. , 2013
"... Copyright © 2013 John Laurence Haller Jr. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The thermal diffusion of a free particle i ..."
Abstract
is a random process and generates entropy at a rate equal to twice the particle's temperature, 2k_BT (in natural units of information per second). The rate is calculated using a Gaussian process whose variance is a combination of quantum and classical diffusion

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University