Results 1 – 10 of 738
The Shannon Capacity of a union
Combinatorica, 1998
Cited by 34 (1 self)
For an undirected graph G = (V, E), let G^n denote the graph whose vertex set is V^n, in which two distinct vertices (u_1, u_2, ..., u_n) and (v_1, v_2, ..., v_n) are adjacent iff for all i between 1 and n either u_i = v_i or u_i v_i ∈ E. The Shannon capacity c(G) of G is the limit ...
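The truncated limit here is the standard one, c(G) = lim_{n→∞} α(G^n)^{1/n}, where α is the independence number and G^n the strong power defined above. For the pentagon C5 this can be checked by brute force: α(C5) = 2 but α(C5 ⊠ C5) = 5, giving Shannon's classical lower bound c(C5) ≥ √5 (Lovász later showed equality). A minimal sketch of that computation, with all function names ours:

```python
from itertools import combinations, product

def cycle_edges(n):
    """Edge set of the cycle C_n on vertices 0..n-1."""
    return {frozenset((i, (i + 1) % n)) for i in range(n)}

def strong_power_vertices(n_vertices, k):
    """Vertex set of G^k: all k-tuples over V."""
    return list(product(range(n_vertices), repeat=k))

def adjacent_in_power(u, v, edges):
    """u, v are adjacent in the strong power iff they are distinct and
    every coordinate pair is either equal or an edge of G."""
    return u != v and all(a == b or frozenset((a, b)) in edges
                          for a, b in zip(u, v))

def independence_number(verts, edges):
    """Brute force: grow k while an independent k-set exists (toy sizes only)."""
    best = 1
    k = 2
    while k <= len(verts):
        if any(all(not adjacent_in_power(u, v, edges)
                   for u, v in combinations(S, 2))
               for S in combinations(verts, k)):
            best = k
            k += 1
        else:
            break
    return best

E5 = cycle_edges(5)
a1 = independence_number(strong_power_vertices(5, 1), E5)  # alpha(C5) = 2
a2 = independence_number(strong_power_vertices(5, 2), E5)  # alpha(C5^2) = 5
print(a1, a2, a2 ** 0.5)  # lower bound c(C5) >= sqrt(5)
```

The exponential search is only feasible for tiny powers, which is exactly why capacities such as c(C7) remain open.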
On the Shannon Capacity of DNA Data Embedding
This paper first gives a brief overview of information embedding in deoxyribonucleic acid (DNA) sequences and its applications. DNA data embedding can be considered as a particular case of communications with or without side information, depending on the use of coding or noncoding DNA sequences, respectively. Although several DNA data embedding methods have been proposed over the last decade, it is still an open question to determine the maximum amount of information that can theoretically be embedded, that is, its Shannon capacity. This is the main question tackled in this paper.
Improved lower bound on the Shannon capacity of ...
2000
Cited by 1 (1 self)
An independent set with 108 vertices in the strong product of four 7-cycles (C7 ⊠ C7 ⊠ C7 ⊠ C7) is given. This improves the best known lower bound for the Shannon capacity of the graph C7, which is the zero-error capacity of the corresponding noisy channel. The search was done ...
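Certifying a lower bound of this kind only requires checking pairwise non-adjacency of the given tuples in the strong product, which is cheap even for a 108-vertex set. A sketch of that check (helper names are ours), demonstrated on Shannon's 5-element set {(i, 2i mod 5)} in C5 ⊠ C5 since the 108-vertex set is not listed here:

```python
def adjacent_in_cycle(a, b, n):
    """a, b are adjacent in the cycle C_n iff they differ by 1 mod n."""
    return (a - b) % n in (1, n - 1)

def is_independent_in_strong_power(points, n):
    """True iff no two tuples are adjacent in the strong product of copies
    of C_n: distinct tuples are adjacent iff every coordinate pair is
    equal or adjacent in C_n."""
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            u, v = points[i], points[j]
            if all(a == b or adjacent_in_cycle(a, b, n)
                   for a, b in zip(u, v)):
                return False
    return True

# Shannon's independent set of size 5 in the strong product C5 x C5:
pentagon_set = [(i, (2 * i) % 5) for i in range(5)]
print(is_independent_in_strong_power(pentagon_set, 5))  # True
```

The same function applied to 4-tuples mod 7 would verify the 108-vertex certificate; finding such a set, rather than checking it, is the hard part.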
Some remarks on the Shannon capacity of odd cycles
Cited by 2 (0 self)
We tackle the problem of estimating the Shannon capacity of cycles of odd length. We present some strategies which allow us to find tight bounds on the Shannon capacity of cycles of various odd lengths, and suggest that the difficulty of obtaining a general result may be related to different ...
On the Shannon capacity and queueing stability of random access multicast
2008
Cited by 3 (1 self)
We study and compare the Shannon capacity region and the stable throughput region for a random access system in which source nodes multicast their messages to multiple destination nodes. Under an erasure channel model which accounts for interference and allows for multipacket reception, we first ...
On Shannon Capacity of Packet Based Communications
2002
A novel approach to the evaluation of upper bounds on attainable information rates of coded packet-based single and multiple access channels is presented. The succession of active and idle periods is modeled with a Markov process, implying that the additive noise-contaminated channel output is a Hidden Markov Process. The increase in capacity that might be achieved through exploitation of timing information is evaluated for packet-based binary and Gaussian single access channels as well as for a Gaussian bursty two-user multiple-access channel.
Entropy and the Shannon capacity of queueing systems
IEEE Workshop on Information Theory, Kruger, South Africa, 1999
Cited by 1 (0 self)
A number of results in the classical theory of point processes assert that the Poisson process is a “fixed point” for the operations of (1) random splitting, (2) independent superposition, and (3) random translation of points. That is, one may begin with a Poisson process (or a finite number of independent Poisson processes) and, as a result of performing each of the above operations, obtain a Poisson process (or a finite number of independent Poisson processes). Classical results in queueing theory also assert that the Poisson process is a fixed point for various queueing systems in the following sense: if the arrival process to such a queueing system is a Poisson process, then so is the equilibrium departure process. Examples of such queueing systems include the first-come-first-served (FCFS) exponential server queue (the M/M/1 queue), a queue which dispenses i.i.d. services with ...
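The fixed-point property under random splitting is easy to check empirically: thinning a Poisson(λ) count by keeping each point independently with probability p should yield counts whose mean and variance both equal pλ, the Poisson signature. A minimal stdlib-only simulation (parameter values are illustrative, not from the paper):

```python
import math
import random

def poisson_sample(rng, lam):
    """Knuth's method: count uniforms until their product drops below e^-lam."""
    threshold = math.exp(-lam)
    k, prod = 0, 1.0
    while prod > threshold:
        k += 1
        prod *= rng.random()
    return k - 1

def thinned_counts(lam, p, trials, seed=0):
    """Draw Poisson(lam) counts and keep each point with probability p."""
    rng = random.Random(seed)
    counts = []
    for _ in range(trials):
        n = poisson_sample(rng, lam)
        counts.append(sum(rng.random() < p for _ in range(n)))
    return counts

counts = thinned_counts(lam=5.0, p=0.4, trials=20000)
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(round(mean, 2), round(var, 2))  # both should be close to p*lam = 2.0
```

Equality of mean and variance is of course only a necessary condition for Poissonness, but it is the quickest sanity check of the splitting property described above.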
On The Shannon Capacity Of Chaos-Based Asynchronous CDMA Systems
We compare the limiting performance of asynchronous DS-CDMA systems based on different sets of spreading sequences, namely chaos-based, ideal random, Gold codes, and maximum-length codes. To do so we consider the Shannon capacity associated with each of those systems, which is itself a random variable ...
On the Equivalence of Shannon Capacity and Stable Capacity in Networks with Memoryless Channels
Cited by 2 (1 self)
An equivalence result is established between the Shannon capacity and the stable capacity of communication networks. Given a discrete-time network with memoryless, time-invariant, discrete-output channels, it is proved that the Shannon capacity equals the stable capacity. The results treat ...
Is "Shannon capacity of noisy computing" zero?
Towards understanding energy requirements for computation with noisy elements, we consider the computation of an arbitrary k-input, k-output binary invertible function. In our model, not all gates need to be noisy. However, the input nodes on the computation graph must be separated from ... the available resources (e.g. gates, wires, etc.) are used. ... The success of information theory in advancing reliable communication is in large part due to the surprising result of Shannon that noisy channels can have non-zero Shannon capacity, i.e., using bounded transmit power, unboundedly low ...