Results 1–10 of 3,330,186
1. Conditionally independent random variables
   2005
   "... In this paper we investigate the notion of conditional independence and prove several information inequalities for conditionally independent random variables. Index Terms: conditionally independent random variables, common information, rate region. ..."
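As a concrete illustration of the notion in the abstract above (using a toy distribution of my own choosing, not one from the paper): if X and Y are independent noisy copies of a fair bit Z, then the conditional mutual information I(X;Y|Z) is 0 while the unconditional I(X;Y) is positive. A minimal sketch in Python:

```python
from itertools import product
from math import log2

# Hypothetical toy distribution: Z is a fair bit, and X and Y are
# independent noisy copies of Z (each equals Z with probability 0.9).
# Then X and Y are conditionally independent given Z, so I(X;Y|Z) = 0,
# while I(X;Y) > 0 because both depend on Z.
p = {}
for z, x, y in product((0, 1), repeat=3):
    px = 0.9 if x == z else 0.1
    py = 0.9 if y == z else 0.1
    p[(x, y, z)] = 0.5 * px * py

def marginal(dist, keep):
    """Sum out all coordinates not listed in `keep`."""
    out = {}
    for key, pr in dist.items():
        sub = tuple(key[i] for i in keep)
        out[sub] = out.get(sub, 0.0) + pr
    return out

pxz = marginal(p, (0, 2))
pyz = marginal(p, (1, 2))
pz = marginal(p, (2,))

# Conditional mutual information I(X;Y|Z)
cmi = sum(pr * log2(pr * pz[(k[2],)] / (pxz[(k[0], k[2])] * pyz[(k[1], k[2])]))
          for k, pr in p.items())

# Ordinary mutual information I(X;Y)
pxy = marginal(p, (0, 1))
px = marginal(p, (0,))
py = marginal(p, (1,))
mi = sum(pr * log2(pr / (px[(k[0],)] * py[(k[1],)])) for k, pr in pxy.items())

print(round(cmi, 6), round(mi, 4))
```

The conditional mutual information comes out to zero (up to floating-point error), confirming conditional independence, while I(X;Y) is strictly positive.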
2. Addition of freely independent random variables
   J. Funct. Anal., 1992
   Cited by 63 (3 self)
   "... A direct proof is given of Voiculescu's addition theorem for freely independent real-valued random variables, using resolvents of self-adjoint operators. In contrast to the original proof, no assumption is made on the existence of moments above the second. The concept of independent random ..."
3. Inequalities for sums of independent random variables
   1988
   Cited by 9 (1 self)
   "... A moment inequality is proved for sums of independent random variables in the Lorentz spaces L_{p,q}, thus extending an inequality of Rosenthal. The latter result is used in combination with a square function inequality to give a proof of a Banach space isomorphism theorem. Further moment ..."
4. Independent random variables in Lorentz spaces
   "... We investigate random variables in Lorentz spaces L_{p,q}. Conditions on the characteristic function are obtained which imply that a random variable belongs to the Lorentz space. Using them, we prove some estimates for the L_{p,q}-norm of sums of independent random variables. Some of these estimates are new ..."
5. Huffman algebras for independent random variables
   IBM RC, 1994
   Cited by 10 (0 self)
   "... Based on a rearrangement inequality by Hardy, Littlewood and Pólya, we define two-operator algebras for independent random variables. These algebras are called Huffman algebras since the Huffman algorithm on these algebras produces an optimal binary tree that minimizes the weighted lengths of leaves ..."
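For reference, the classical Huffman procedure that the abstract generalizes can be sketched as follows. This is the standard textbook version with ordinary addition as the combining operation (the paper's point is that other "algebras" admit the same greedy algorithm), not the paper's own construction:

```python
import heapq

def huffman_cost(weights):
    """Minimum total weighted leaf depth over all binary trees
    with the given leaf weights, via the classical greedy merge."""
    heap = list(weights)
    heapq.heapify(heap)
    cost = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)   # two smallest weights
        b = heapq.heappop(heap)
        cost += a + b             # each merge adds one level above both subtrees
        heapq.heappush(heap, a + b)
    return cost

print(huffman_cost([1, 1, 2, 4]))  # → 14 (depths 3, 3, 2, 1)
```

Repeatedly merging the two smallest weights is optimal here; a "Huffman algebra" in the paper's sense is, roughly, a pair of operations for which this same greedy strategy remains optimal.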
6. Simple Constructions of Almost k-wise Independent Random Variables
   1992
   Cited by 319 (42 self)
   "... We present three alternative simple constructions of small probability spaces on n bits for which any k bits are almost independent. The number of bits used to specify a point in the sample space is (2 + o(1))(log log n + k/2 + log k + log(1/ε)), where ε is the statistical difference between ..."
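The paper's constructions are more refined, but the underlying idea (many nearly independent bits from a short seed) can be seen in a simple, exactly pairwise-independent construction: take the XOR over each nonempty subset of an m-bit uniform seed. The brute-force verification below is a sketch of my own, not the paper's method:

```python
from itertools import product, combinations

# From an m-bit uniform seed, the XORs over all nonempty subsets of the
# seed bits give 2^m - 1 output bits that are exactly pairwise
# independent, using only m truly random bits.
m = 3
subsets = [s for r in range(1, m + 1) for s in combinations(range(m), r)]

def sample_space():
    """Yield the output bit-vector for every possible seed."""
    for seed in product((0, 1), repeat=m):
        yield [sum(seed[i] for i in s) % 2 for s in subsets]

# Pairwise independence check: every pair of distinct output bits takes
# each of the four value combinations for exactly 1/4 of the seeds.
n = len(subsets)
pairwise_ok = all(
    sum(1 for out in sample_space() if (out[i], out[j]) == (a, b)) == 2 ** m // 4
    for i in range(n) for j in range(i + 1, n)
    for a in (0, 1) for b in (0, 1)
)
print(pairwise_ok)  # → True
```

This gives exact pairwise (k = 2) independence; the point of the paper is that relaxing to *almost* k-wise independence allows far smaller sample spaces for general k.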
7. On the entropy of the sum and of the difference of independent random variables
   in Proc. IEEE Conven. Elect. Electron. Eng. Israel (IEEEI), 2008
   Cited by 2 (0 self)
   "... We show that the entropy of the sum of independent random variables can greatly differ from the entropy of their difference. The gap between the two entropies can be arbitrarily large. This holds for regular entropies as well as differential entropies. Our results rely heavily on a result of Ruzsa ..."
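A small numerical instance of the phenomenon (a toy distribution of my own, not the paper's construction, which achieves arbitrarily large gaps): with X and Y i.i.d. uniform on {0, 1, 3}, the entropies of X + Y and X - Y already differ:

```python
from collections import Counter
from itertools import product
from math import log2

# X, Y i.i.d. uniform on an asymmetric support; the sum and the
# difference then have genuinely different Shannon entropies.
support = (0, 1, 3)

def entropy(values):
    """Shannon entropy (in bits) of the empirical distribution of `values`."""
    counts = Counter(values)
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())

pairs = list(product(support, repeat=2))   # uniform over the 9 (x, y) pairs
h_sum = entropy([x + y for x, y in pairs])
h_diff = entropy([x - y for x, y in pairs])
print(round(h_sum, 4), round(h_diff, 4))   # H(X+Y) < H(X-Y) here
```

Here X - Y spreads its mass over seven values while X + Y concentrates on six, so H(X - Y) > H(X + Y); the paper shows the gap can be made arbitrarily large in either direction.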
8. Concentration and Moment Inequalities for Polynomials of Independent Random Variables
   Cited by 8 (2 self)
   "... In this work we design a general method for proving moment inequalities for polynomials of independent random variables. Our method works for a wide range of random variables including Gaussian, Boolean, exponential, Poisson and many others. We apply our method to derive general concentration ..."
9. Paths of a Continuum of Independent Random Variables
   2014
   Cited by 1 (0 self)
   "... We outline the basic method of analysis, similar to Judd (1985), that allows one to show that for the paths of a continuum of independent (not necessarily i.i.d.) random variables certain properties can be assumed. In particular, we find that the assumption of a continuum of independent random ..."
10. Measuring the Magnitude of Sums of Independent Random Variables
    2001
    Cited by 14 (1 self)
    "... This paper considers how to measure the magnitude of the sum of independent random variables in several ways. We give a formula for the tail distribution for sequences that satisfy the so-called Lévy property. We then give a connection between the tail distribution and the pth moment, and between ..."