
## On Discrete Alphabets for the Two-user Gaussian Interference Channel with One Receiver Lacking Knowledge of the Interfering Codebook

Citations: 3 (3 self)

### Citations

444 | Gaussian Interference Channel Capacity to Within One Bit
- Etkin, Tse, et al.
- 2008
Citation Context: ...performance of the scheme in Theorem 6 at high SNR by using the gDoF region as a metric. For each rate R_i we define a gDoF d_i as in (9), for i ∈ [1 : 2], where we parameterize INR = SNR^α for some α ≥ 0 [11]. In the spirit of Theorem 2, we take N = ⌊√(1 + SNR^β)⌋. With this, we have the following achievable gDoF region. Theorem 7. From Theorem 6, the following (d_1, d_2) pairs are achievable: d_1 ≤ { β if 1...
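The gDoF metric used in this excerpt is only referenced, not written out; assuming equation (9) of the paper matches the standard high-SNR definition that goes with the parameterization INR = SNR^α of [11], it reads:

```latex
% Generalized degrees of freedom (gDoF) of user i, with INR = SNR^{\alpha}:
% the rate normalized by the point-to-point Gaussian capacity at high SNR.
d_i \triangleq \lim_{\mathrm{SNR}\to\infty}
  \frac{R_i}{\tfrac{1}{2}\log\left(1+\mathrm{SNR}\right)},
  \qquad i \in [1:2].
```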

375 | Network Information Theory
- Gamal, Kim
- 2012
Citation Context: ... sharing sequence has a product distribution. Besides the restriction in (1) on the allowed class of codes, the probability of error, achievable rates, and capacity region are defined in the usual way [6]. In this work we consider the practically relevant real-valued single-antenna symmetric Gaussian noise case. The restriction to symmetric channel gains is just for ease of exposition; all the results...

238 | The capacity of the Gaussian interference channel under strong interference
- Sato
- 1981
Citation Context: ... so that the following sum-rate is achievable for the G-IC-OR R_1 + R_2 ≤ log⌊√(1 + (INR/(1 + SNR))^(1−ε))⌋ + (1/2) log(1 + SNR) − log(e/2) − 2. In this regime, the classical G-IC has sum-capacity [14] R_1 + R_2 ≤ (1/2) log(1 + SNR + INR). The gap is hence (1/2) log(1 + SNR + INR) − log⌊√(1 + (INR/(1 + SNR))^(1−ε))⌋ − (1/2) log(1 + SNR) + 2 + log(e/2) ≤ [ (1/2) log( (1/6) ln(INR/(1 + SNR)) ) ]^+ + 3 + log(...
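For reference, the classical strong-interference sum-capacity quoted in the excerpt is the multiple-access sum-rate bound; a minimal restatement, assuming the symmetric real-valued setting used throughout this paper (per [14]):

```latex
% Strong interference (INR >= SNR): both messages can be decoded at each
% receiver, so the G-IC sum-capacity reduces to the MAC sum-rate bound:
R_1 + R_2 \le \tfrac{1}{2}\log\!\left(1 + \mathrm{SNR} + \mathrm{INR}\right)
\quad \text{bits per channel use.}
```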

116 | The worst additive noise under a covariance constraint
- Diggavi, Cover
- 2001
Citation Context: ...e was no interference, which is again highly desirable. In contrast, a Gaussian r.v. is considered to be the “best” input but the “worst” interference/noise when subject to a second moment constraint [9]. Consider the point-to-point Gaussian channel Y = √SNR X + Z, (8a) E[X²] ≤ 1, Z ∼ N(z; 0, 1), (8b) whose capacity C = Ig(SNR) is achieved by X ∼ N(x; 0, 1) at all SNRs. For this channel the gDoF...
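The point-to-point baseline in this excerpt is easy to check numerically; a minimal sketch, assuming Ig(SNR) denotes the Gaussian-input mutual information (1/2) log2(1 + SNR) in bits per channel use:

```python
import math

def awgn_capacity(snr: float) -> float:
    """Capacity in bits/channel use of Y = sqrt(SNR)*X + Z, E[X^2] <= 1, Z ~ N(0, 1)."""
    return 0.5 * math.log2(1.0 + snr)

# gDoF sanity check: C(SNR) normalized by (1/2) log2(SNR) tends to 1 as SNR
# grows, i.e., the Gaussian input achieves gDoF = 1 on this channel.
for snr_db in (20, 60, 100):
    snr = 10.0 ** (snr_db / 10.0)
    print(f"{snr_db} dB: ratio = {awgn_capacity(snr) / (0.5 * math.log2(snr)):.6f}")
```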

99 | A case where interference does not reduce capacity
- Carleial
- 1975
Citation Context: ...7 so that the following rates are achievable for the G-IC-OR R_1 ≤ log⌊√(1 + SNR^(1−ε))⌋ − (1/2) log(e/2) − 1, R_2 ≤ (1/2) log(1 + SNR) − (1/2) log(e/2) − 1. In this regime, the classical G-IC has capacity [13] R_1 ≤ (1/2) log(1 + SNR), R_2 ≤ (1/2) log(1 + SNR). Clearly, the gap for R_2 is a constant (with respect to (SNR, INR)) given by (1/2) log(e/2) + 1 ≈ 1.2213; the gap for R_1 is as in (15). Although the theorem...
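The numeric constant quoted for the R_2 gap can be verified directly (logarithms in base 2, as is standard for rates in bits):

```python
import math

# Gap for R2 claimed in the excerpt: (1/2) * log2(e/2) + 1 bits.
gap_r2 = 0.5 * math.log2(math.e / 2.0) + 1.0
print(round(gap_r2, 4))  # -> 1.2213
```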

54 | Bounds on the capacity region of a class of interference channels
- Telatar, Tse
- 2007
Citation Context: ...esired message, but not that of the interfering message). The capacity region of the IC-OR was characterized to within a constant gap for the class of injective semi-deterministic ICs in the spirit of [5]. In particular, the capacity of the real-valued Gaussian IC-OR (G-IC-OR) was characterized to within 1/2 bit per channel use per user; however, the input distribution achieving such a gap was not fou...

44 | Communication via decentralized processing
- Sanderovich, Shamai, et al.
- 2008
Citation Context: ...wledge and to nodes with only knowledge of a subset of the codebooks as oblivious receivers. A. Past Work To the best of our knowledge, systems with partial codebook knowledge were first introduced in [1]. In [1], lack of codebook knowledge was modeled by using codebook indices, which index the random encoding functions that map the messages to the codewords. If a node has codebook knowledge it knows the...

21 | MMSE dimension
- Wu, Verdú
- 2011

15 | The impact of constellation cardinality on Gaussian channel capacity
- Wu, Verdú
Citation Context: ...9) Consider now the performance of the input X ∼ PAM(N, √(12/(N² − 1))). (10) It was shown in [10, Th. 10] that for any fixed N independent of SNR the gDoF is zero. Similar conclusions were found in [7] by considering high-SNR approximations of the finite-constellation capacity of the point-to-point Gaussian channel, defined as the maximum rate achieved by a discrete input constrained to have a fini...
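The second argument of PAM(·, ·) in (10) is the constellation spacing, chosen so the input has unit average energy; a quick check of that normalization (the helper `pam_points` is a hypothetical name, not from the paper):

```python
import math

def pam_points(n: int) -> list:
    """Zero-mean, uniformly spaced N-point PAM with spacing d = sqrt(12/(N^2 - 1))."""
    d = math.sqrt(12.0 / (n * n - 1.0))
    return [d * (k - (n - 1) / 2.0) for k in range(n)]

# Average energy of uniform PAM is d^2 * (N^2 - 1) / 12, which this spacing
# makes equal to 1 for every N >= 2, satisfying E[X^2] <= 1.
for n in (2, 4, 16):
    pts = pam_points(n)
    print(n, sum(x * x for x in pts) / len(pts))
```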

9 | On Codebook Information for Interference Relay Channels With Out-of-Band Relaying
- Simeone, Erkip, et al.
- 2011
Citation Context: ...om encoding function used; else it does not, and the codewords essentially look like the symbols were produced in an independent, identically distributed (i.i.d.) fashion from a given distribution. In [2] and [3] this concept of partial codebook knowledge was extended to model oblivious relays, where only multi-letter capacity expressions were obtained. As pointed out in [2, Section III.A] and [3, Rem...

5 | Relaying for multiple sources in the absence of codebook information
- Tian, Yener
- 2011
Citation Context: ...ing function used; else it does not, and the codewords essentially look like the symbols were produced in an independent, identically distributed (i.i.d.) fashion from a given distribution. In [2] and [3] this concept of partial codebook knowledge was extended to model oblivious relays, where only multi-letter capacity expressions were obtained. As pointed out in [2, Section III.A] and [3, Remark 5], ...

4 | High-SNR asymptotics of mutual information for discrete constellations
- Alvarado, Brännström, et al.
- 2013
Citation Context: ...w SNR, respectively, for a point-to-point power-constrained Gaussian noise channel; [7, Theorem 8] gives a mutual information lower bound that holds for the Gauss quadrature distribution for all SNRs; [8] considers arbitrary input constellations with distribution independent of SNR and finds exact asymptotic expressions for the rate in the high-SNR limit. Here we cannot use these results as we need f...

2 | On the capacity of interference channels with partial codebook knowledge
- Dytso, Devroye, et al.
Citation Context: ... is not known how to find the optimal input distribution in general. In particular, the capacity-achieving distribution for the practically relevant Gaussian noise channel remains an open problem. In [4] we introduced the two-user Interference Channel (IC) with one Oblivious Receiver, referred to as the IC-OR. In the IC-OR, one receiver has full codebook knowledge (as in the classical IC), but the ot...

2 | On the utility of discrete alphabets in Gaussian interference channels
- Dytso, Tuninetti, et al.
- 2014
Citation Context: ...gime; indeed, in weak interference it is not optimal to set U_2 = X_2 in [4, Lemma 3] as we did in Theorem 6; different choices of discrete inputs for the general region in [4, Lemma 3] are reported in [12]. VII. FINITE SNR PERFORMANCE In the previous section we showed that in strong interference (α ≥ 1) the sum-gDoF of the classical G-IC can be approached with any precision even when one receiver lacks...