Results 1–10 of 2,884
Mutually independent commitments
 Lecture Notes in Computer Science
, 2001
"... Abstract. We study the two-party commitment problem, where two players have secret values they wish to commit to each other. Traditional commitment schemes cannot be used here because they do not guarantee independence of the committed values. We present three increasingly strong definitions of inde ..."
Cited by 7 (1 self)
The Almost Equivalence of Pairwise and Mutual Independence and the Duality With Exchangeability
 Fields
, 1998
"... For a large collection of random variables in an ideal setting, pairwise independence is shown to be almost equivalent to mutual independence. An asymptotic interpretation of this fact shows the equivalence of asymptotic pairwise independence and asymptotic mutual independence for a triangular arr ..."
Cited by 15 (12 self)
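The gap between pairwise and mutual independence that this paper studies is easiest to see in the classic finite counterexample (Bernstein's example), where every pair of variables is independent but the triple is not. A minimal Python sketch, assuming two fair coin flips X, Y and Z = X XOR Y:

```python
from itertools import product

# Bernstein's example: X, Y are fair coins, Z = X XOR Y.
# The four outcomes (x, y, x^y) are equally likely.
outcomes = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]

def p(event):
    """Probability of an event, given as a predicate on an outcome triple."""
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

# Pairwise independence: P(A=a, B=b) = P(A=a) * P(B=b) for every pair of coordinates.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    for a, b in product([0, 1], repeat=2):
        assert p(lambda o: o[i] == a and o[j] == b) == \
               p(lambda o: o[i] == a) * p(lambda o: o[j] == b)

# Mutual independence fails: (1, 1, 1) is impossible,
# yet the product of the three marginals is 1/8.
print(p(lambda o: o == (1, 1, 1)))  # 0.0
print(p(lambda o: o[0] == 1) * p(lambda o: o[1] == 1) * p(lambda o: o[2] == 1))  # 0.125
```

The paper's point is that for *large* collections of variables in an ideal setting, such counterexamples become negligible and the two notions almost coincide.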
Mutually Independent Hamiltonian Paths in Star Networks
"... Let P1 = 〈v1, v2, ..., vk〉 and P2 = 〈u1, u2, ..., uk〉 be any two hamiltonian paths of G. We say that P1 and P2 are independent if u1 = v1, vk = uk, and vi ≠ ui for 1 < i < k. We say a set of hamiltonian paths P1, P2, ..., Ps of G are mutually independent if any two different hamiltonian paths are in ..."
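The definition above (shared endpoints, disagreement at every interior position) translates directly into a small checker. This is an illustrative Python sketch; the function names are ours, not from the paper:

```python
def are_independent(p1, p2):
    """Two hamiltonian paths <v1..vk>, <u1..uk> of the same graph are
    independent iff they share both endpoints but differ at every
    interior position i, 1 < i < k."""
    k = len(p1)
    return (len(p2) == k
            and p1[0] == p2[0] and p1[-1] == p2[-1]
            and all(p1[i] != p2[i] for i in range(1, k - 1)))

def mutually_independent(paths):
    """A set of hamiltonian paths is mutually independent iff every
    pair of distinct paths is independent."""
    return all(are_independent(paths[i], paths[j])
               for i in range(len(paths))
               for j in range(i + 1, len(paths)))

# Two paths on vertices 1..4 sharing endpoints 1 and 4 but no interior position:
print(mutually_independent([[1, 2, 3, 4], [1, 3, 2, 4]]))  # True
```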
Mutually Independent Hamiltonian Cycles of Pancake Networks
"... A hamiltonian cycle C of G is described as 〈u1, u2, ..., u_{n(G)}, u1〉 to emphasize the order of nodes in C. Thus, u1 is the beginning node and ui is the ith node in C. Two hamiltonian cycles of G beginning at a node x, C1 = 〈u1, u2, ..., u_{n(G)}, u1〉 and C2 = 〈v1, v2, ..., v_{n(G)}, v1〉, are independent i ..."
Cited by 2 (0 self)
if x = u1 = v1, and ui ≠ vi for every 2 ≤ i ≤ n(G). A set of hamiltonian cycles {C1, C2, ..., Ck} of G are mutually independent if any two different hamiltonian cycles are independent. The mutually independent hamiltonicity of a graph G, IHC(G), is the maximum integer k such that for any node u of G
Mutually independent Hamiltonian cycles in dual-cubes
, 2010
"... The hypercube family Qn is one of the most well-known interconnection networks in parallel computers. Building on Qn, the dual-cube networks, denoted by DCn, were introduced and shown to be (n + 1)-regular, vertex-symmetric graphs with some fault-tolerant Hamiltonian properties. In addition, DCn's are shown ..."
Cited by 2 (1 self)
DCn's are shown to be superior to Qn's in many aspects. In this article, we will prove that the n-dimensional dual-cube DCn contains n + 1 mutually independent Hamiltonian cycles for n ≥ 2. More specifically, let vi ∈ V(DCn) for 0 ≤ i ≤ |V(DCn)| − 1 and let 〈v0, v1, ..., v_{|V(DCn)|−1}, v0〉 be a
Blind Signal Separation: Statistical Principles
, 2003
"... Blind signal separation (BSS) and independent component analysis (ICA) are emerging techniques of array processing and data analysis, aiming at recovering unobserved signals or `sources' from observed mixtures (typically, the output of an array of sensors), exploiting only the assumption of mut ..."
Cited by 529 (4 self)
of mutual independence between the signals. The weakness of the assumptions makes it a powerful approach but requires venturing beyond familiar second-order statistics. The objective of this paper is to review some of the approaches that have been recently developed to address this exciting problem
Kernel independent component analysis
 Journal of Machine Learning Research
, 2002
"... We present a class of algorithms for independent component analysis (ICA) which use contrast functions based on canonical correlations in a reproducing kernel Hilbert space. On the one hand, we show that our contrast functions are related to mutual information and have desirable mathematical propert ..."
Cited by 464 (24 self)
Fast and robust fixed-point algorithms for independent component analysis
 IEEE Trans. Neural Netw.
, 1999
"... Independent component analysis (ICA) is a statistical method for transforming an observed multidimensional random vector into components that are statistically as independent from each other as possible. In this paper, we use a combination of two different approaches for linear ICA: Comon’s informat ..."
Cited by 884 (34 self)
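The fixed-point idea named in the title can be sketched in a few lines of numpy: whiten the mixtures, then run a one-unit FastICA update with the tanh nonlinearity, deflating over components. The sources, mixing matrix, and iteration count below are illustrative choices of ours, not the paper's experiments:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
# Two non-Gaussian sources (illustrative): sub-Gaussian uniform, super-Gaussian Laplace.
s = np.vstack([rng.uniform(-np.sqrt(3), np.sqrt(3), n),
               rng.laplace(0.0, 1.0, n)])
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])          # "unknown" mixing matrix
x = A @ s                           # observed mixtures

# Whitening: center, then rotate/scale so the covariance becomes the identity.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = (E / np.sqrt(d)) @ E.T @ x      # z = E D^{-1/2} E^T x

# One-unit FastICA fixed-point iteration, g = tanh, with Gram-Schmidt deflation.
W = np.zeros((2, 2))
for k in range(2):
    w = rng.standard_normal(2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        wx = w @ z
        # w <- E[z g(w^T z)] - E[g'(w^T z)] w, then re-orthonormalize.
        w_new = (z * np.tanh(wx)).mean(axis=1) - (1 - np.tanh(wx) ** 2).mean() * w
        w_new -= W[:k].T @ (W[:k] @ w_new)   # deflate against rows already found
        w_new /= np.linalg.norm(w_new)
        w = w_new
    W[k] = w

y = W @ z   # recovered sources, up to sign and permutation
```

After convergence, each row of `y` correlates (up to sign) with one of the true sources; sign and ordering are inherently unidentifiable, which is the usual ICA ambiguity.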
Mutually Independent Hamiltonian Cycles of Double Loop Graphs
 The 29th Workshop on Combinatorial Mathematics and Computation Theory
"... ..."