Results 1–10 of 30
Joint blind source separation by multiset canonical correlation analysis
 IEEE Trans. Signal Processing
, 2009
Cited by 27 (7 self)
Abstract—In this paper, we introduce a simple and effective scheme to achieve joint blind source separation (BSS) of multiple datasets using multiset canonical correlation analysis (MCCA) [J. R. Kettenring, “Canonical analysis of several sets of variables,” Biometrika, vol. 58, pp. 433–451, 1971]. We first propose a generative model of joint BSS based on the correlation of latent sources within and between datasets. We specify source separability conditions and show that, when the conditions are satisfied, the group of corresponding sources from each dataset can be jointly extracted by MCCA through maximization of correlation among the extracted sources. We compare the source separation performance of the MCCA scheme with that of other joint BSS methods and demonstrate the superior performance of the MCCA scheme in achieving joint BSS for a large number of datasets, for groups of corresponding sources with heterogeneous correlation values, and for complex-valued sources with circular and noncircular distributions. We apply MCCA to the analysis of functional magnetic resonance imaging (fMRI) data from multiple subjects and show its utility in estimating meaningful brain activations from a visuomotor task. Index Terms—Canonical correlation analysis, group analysis, independent component analysis, joint blind source separation.
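The correlation-based extraction described in this abstract can be illustrated in the simplified two-dataset case, where MCCA reduces to ordinary CCA: after whitening each mixture, an SVD of the cross-covariance recovers the corresponding sources when their cross-correlations are distinct. The sketch below is illustrative only; the toy source distributions, correlation profile, and mixing matrices are invented for the example and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Corresponding sources across the two datasets are correlated, with
# distinct correlation values -- one form of the separability condition.
rho = np.array([0.95, 0.7, 0.4])
s1 = rng.laplace(size=(3, n))
s2 = rho[:, None] * s1 + np.sqrt(1 - rho**2)[:, None] * rng.laplace(size=(3, n))

# Unknown (here, fixed well-conditioned) linear mixtures of each source set.
A1 = np.array([[1.0, 0.5, 0.2], [0.3, 1.0, 0.4], [0.2, 0.3, 1.0]])
A2 = np.array([[1.0, 0.2, 0.4], [0.5, 1.0, 0.2], [0.3, 0.4, 1.0]])
x1, x2 = A1 @ s1, A2 @ s2

def whiten(x):
    # Zero-mean, unit-covariance transform via an eigendecomposition.
    x = x - x.mean(axis=1, keepdims=True)
    d, e = np.linalg.eigh(np.cov(x))
    return np.diag(1.0 / np.sqrt(d)) @ e.T @ x

z1, z2 = whiten(x1), whiten(x2)

# CCA via SVD of the whitened cross-covariance: the singular values estimate
# the source correlations, and the singular vectors give demixing directions.
u, sv, vt = np.linalg.svd(z1 @ z2.T / n)
y1, y2 = u.T @ z1, vt @ z2   # jointly extracted source groups (up to sign)
```

Because the correlation values in `rho` are distinct, the SVD is unique up to signs and the extracted components line up with the true sources group by group.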
Complex independent component analysis by entropy bound minimization
 IEEE Trans. on Circuits and Systems I
Cited by 21 (10 self)
Abstract—We first present a new (differential) entropy estimator for complex random variables by approximating the entropy estimate using a numerically computed maximum entropy bound. The associated maximum entropy distributions belong to the class of weighted linear combinations and elliptical distributions, and together they provide a rich array of bivariate distributions for density matching. Next, we introduce a new complex independent component analysis (ICA) algorithm, complex ICA by entropy-bound minimization (complex ICA-EBM), using this new entropy estimator and a line search optimization procedure. We present simulation results to demonstrate the superior separation performance and computational efficiency of complex ICA-EBM in the separation of complex sources that come from a wide range of bivariate distributions. Index Terms—Complex optimization, complex random variable, differential entropy, independent component analysis (ICA), neural networks, principle of maximum entropy.
Complex ICA using nonlinear functions
 IEEE Trans. Signal Process
, 2008
Cited by 20 (12 self)
Abstract—We introduce a framework based on Wirtinger calculus for nonlinear complex-valued signal processing such that all computations can be directly carried out in the complex domain. The two main approaches for performing independent component analysis, maximum likelihood and maximization of non-Gaussianity, which are intimately related to each other, are studied using this framework. The main update rules for the two approaches are derived, and their properties and density-matching strategies are discussed, along with numerical examples that highlight their relationships. Index Terms—Complex optimization, density matching, independent component analysis (ICA), maximum-likelihood estimation, negentropy maximization.
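A concrete property underlying such Wirtinger-calculus frameworks is that, for a real-valued cost f of a complex variable z = x + jy, the conjugate derivative ∂f/∂z* = (∂f/∂x + j ∂f/∂y)/2 is well defined even when f is non-analytic, and it gives the steepest-ascent direction. A small numerical sketch (the test function and step size are illustrative choices, not from the paper):

```python
import numpy as np

def wirtinger_conj_grad(f, z, h=1e-6):
    # For f: C -> R, the conjugate Wirtinger derivative is
    # df/dz* = (df/dx + 1j * df/dy) / 2, approximated by central differences.
    dfdx = (f(z + h) - f(z - h)) / (2 * h)
    dfdy = (f(z + 1j * h) - f(z - 1j * h)) / (2 * h)
    return 0.5 * (dfdx + 1j * dfdy)

f = lambda z: (z * np.conj(z)).real   # f(z) = |z|^2: real-valued, non-analytic
z0 = 1.5 - 2.0j
g = wirtinger_conj_grad(f, z0)        # analytically df/dz* = z, so g ≈ z0
```

This is the identity that lets the update rules in the abstract be written directly in the complex domain, without splitting into real and imaginary parts.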
Adaptable nonlinearity for complex maximization of non-Gaussianity and a fixed-point algorithm
 in Proc. IEEE Workshop on Machine Learning for Signal Processing (MLSP
, 2006
Cited by 19 (12 self)
Complex maximization of non-Gaussianity (CMN) has been shown to provide reliable separation of both circular and noncircular sources using a class of complex functions in the nonlinearity. In this paper, we derive a fixed-point algorithm for blind separation of noncircular sources using CMN. We also introduce the adaptive CMN (ACMN) algorithm, which provides significant performance improvement by adapting the nonlinearity to the source distribution. The ability of ACMN to adapt to a wide range of source statistics is demonstrated by simulation results.
The maximum likelihood approach to complex ICA
 in Proc. ICASSP
, 2006
Cited by 18 (11 self)
We derive the form of the best nonlinear functions for performing independent component analysis (ICA) by maximum likelihood estimation. We show that both the form of the nonlinearity and the relative gradient update equations for likelihood maximization naturally generalize to the complex case, and that they coincide with those for the real case. We discuss several special cases for the score function as well as adaptive scores.
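The real-valued relative gradient update that, per this abstract, carries over to the complex case can be sketched as below. The tanh score, step size, and iteration count are illustrative choices for super-Gaussian toy sources, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10000
s = rng.laplace(size=(2, n))              # super-Gaussian toy sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])    # unknown mixing matrix
x = A @ s
x = x - x.mean(axis=1, keepdims=True)

# Relative-gradient ascent on the likelihood with score phi(y) = tanh(y):
#   W <- W + mu * (I - E[phi(y) y^T]) W
W = np.eye(2)
mu = 0.1
for _ in range(500):
    y = W @ x
    W = W + mu * (np.eye(2) - np.tanh(y) @ y.T / n) @ W
y = W @ x                                  # separated sources (up to scale/order)
```

At the stationary point E[phi(y) y^T] = I, which is the nonlinear decorrelation condition the likelihood update enforces; the multiplication by W on the right is what makes the update relative (equivariant) rather than a plain gradient step.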
Complex ICA by negentropy maximization
 IEEE Trans. Neural Netw
, 2008
Cited by 15 (3 self)
Abstract—In this paper, we use complex analytic functions to achieve independent component analysis (ICA) by maximization of non-Gaussianity and introduce the complex maximization of non-Gaussianity (CMN) algorithm. We derive both a gradient-descent and a quasi-Newton algorithm that use the full second-order statistics, providing superior performance with circular and noncircular sources as compared to existing methods. We show the connection among ICA methods based on maximization of non-Gaussianity, mutual information, and maximum likelihood (ML) for the complex case, and emphasize the importance of density matching for all three cases. Local stability conditions are derived for the CMN cost function that explicitly show the effects of noncircularity on convergence; these effects are demonstrated through simulation examples. Index Terms—Complex-valued data, independent component analysis (ICA), quasi-Newton algorithm.
ICA by maximization of non-Gaussianity using complex functions
 in Proc. MLSP
, 2005
Cited by 13 (9 self)
We use complex, hence analytic, functions to achieve independent component analysis (ICA) by maximization of non-Gaussianity and introduce the complex maximization of non-Gaussianity (CMN) algorithm. We show that CMN converges to the principal component of the source distribution and that the algorithm provides robust performance for both circular and noncircular sources.
A class of complex ICA algorithms based on the kurtosis cost function
 IEEE Trans. Neural Netw
, 2008
Cited by 12 (3 self)
Abstract—In this paper, we introduce a novel way of performing real-valued optimization in the complex domain. This framework enables a direct complex optimization technique when the cost function satisfies Brandwood's independent analyticity condition. In particular, this technique is used to derive three algorithms, namely kurtosis maximization using gradient update (KMG), kurtosis maximization using fixed-point update (KMF), and kurtosis maximization using Newton update (KMN), to perform complex independent component analysis (ICA) based on maximization of the complex kurtosis cost function. The derivation and related analysis of the three algorithms are performed in the complex domain without using any complex-to-real mapping for differentiation and optimization. A general complex Newton rule is also derived for developing the KMN algorithm. The real conjugate gradient algorithm is extended to the complex domain in a manner similar to the derivation of the complex Newton rule. The simulation results indicate that the fixed-point version (KMF) and the gradient version (KMG) are superior to other similar algorithms when the sources include both circular and noncircular distributions and the dimension is relatively high. Index Terms—Complex independent component analysis, complex Newton update, fixed-point update.
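The flavor of a kurtosis-based fixed-point update can be sketched as follows. This is a one-unit, FastICA-style iteration for whitened circular sources, not the paper's exact KMF algorithm; the toy sources, mixing matrix, and constants are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000

# Two circular complex sources with unit power: QPSK (sub-Gaussian kurtosis)
# and a "complex Laplace" (super-Gaussian kurtosis).
s1 = (rng.choice([-1.0, 1.0], n) + 1j * rng.choice([-1.0, 1.0], n)) / np.sqrt(2)
s2 = (rng.laplace(size=n) + 1j * rng.laplace(size=n)) / 2.0
x = np.array([[1.0, 0.5 - 0.2j], [0.3 + 0.4j, 1.0]]) @ np.vstack([s1, s2])

# Whiten the complex mixtures.
x = x - x.mean(axis=1, keepdims=True)
d, e = np.linalg.eigh(x @ x.conj().T / n)
z = np.diag(1.0 / np.sqrt(d.real)) @ e.conj().T @ x

# One-unit fixed-point iteration driving |kurtosis| of y = w^H z to a maximum:
#   w <- E[z conj(y) |y|^2] - 2 w,  then renormalize.
w = rng.standard_normal(2) + 1j * rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(100):
    y = w.conj() @ z
    w_new = z @ (np.conj(y) * np.abs(y) ** 2) / n - 2.0 * w
    w = w_new / np.linalg.norm(w_new)
y = w.conj() @ z   # one extracted source, up to a phase factor
```

At a source direction the update reduces to w_new = kurt(s_i) * w, so the direction is a fixed point up to sign; deflation against previously found directions would extract the remaining sources.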
Stability analysis of complex maximum likelihood ICA using Wirtinger calculus
 in Proc. ICASSP, Las Vegas, NV
, 2008
Cited by 6 (4 self)
The desirable asymptotic optimality properties of the maximum likelihood (ML) estimator make it an attractive solution for performing independent component analysis (ICA). Wirtinger calculus provides an attractive framework for the derivation and analysis of complex-valued algorithms using nonlinear functions, and hence of ICA algorithms as well. Local stability analysis of complex ICA based on ML presents a unique challenge: in addition to the need to compute derivatives, the Hessian of a matrix quantity must be evaluated, and in the complex case it assumes a significantly more complicated form than in the real-valued case. In this paper, we demonstrate how Wirtinger calculus allows the use of an elegant approach proposed by Amari et al. [5] in the analysis, thus enabling the derivation of the conditions for local stability of complex ML ICA. We further study the implications of these conditions for a generalized Gaussian density model.
Quaternionic independent component analysis using hypercomplex nonlinearities
 In 7th IMA Conference on Mathematics in Signal Processing
, 2006