## Fast algorithms for Gaussian noise invariant independent component analysis (2013)

Venue: Advances in Neural Information Processing Systems 26

Citations: 3 (3 self)

### Citations

883 | Fast and robust fixed-point algorithms for independent component analysis
- Hyvärinen
- 1999
Citation Context: ... In practice, most methods ignore the noise term, leaving the simpler problem of recovering the mixing matrix A when x = As is observed. Arguably the two most widely used ICA algorithms are FastICA [13] and JADE [6]. Both of these algorithms are based on a two-step process: (1) The data is centered and whitened, that is, made to have identity covariance matrix. This is typically done using principal ...
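The centering-and-whitening preprocessing step described in this excerpt is easy to sketch in NumPy. This is an illustrative sketch of the generic step (1), not the cited authors' implementation; the function name `center_and_whiten` is our own.

```python
import numpy as np

def center_and_whiten(x):
    """Center the data and map it to identity covariance
    (the generic step (1) of FastICA/JADE described above).

    x : (d, N) array of N observed d-dimensional samples.
    """
    x = x - x.mean(axis=1, keepdims=True)        # centering
    cov = np.cov(x)                              # d x d sample covariance
    vals, vecs = np.linalg.eigh(cov)             # PCA: eigendecomposition of cov
    W = vecs @ np.diag(vals ** -0.5) @ vecs.T    # whitening matrix cov^(-1/2)
    return W @ x

# Toy mixture x = A s with non-Gaussian (Laplace) sources.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
s = rng.laplace(size=(3, 10_000))
z = center_and_whiten(A @ s)
print(np.cov(z).round(2))                        # approximately the 3x3 identity
```

As the excerpt notes, when the observation actually contains additive noise, whitening against the raw covariance is no longer justified, which is the failure the paper's quasi-orthogonalization addresses.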

846 | Independent component analysis: algorithms and applications. Neural networks
- Hyvärinen, Oja
- 2000
Citation Context: ... extensive literature on ICA in the signal processing and machine learning communities due to its applicability to a variety of important practical situations. For a comprehensive introduction see the books [8, 14]. In this paper we develop techniques for dealing with noisy data by introducing new and more efficient techniques for quasi-orthogonalization and subsequent component recovery. The quasi-orthogonaliz...

719 | Blind beamforming for non-Gaussian signals.
- Cardoso, Souloumiac
- 1993
Citation Context: ... most methods ignore the noise term, leaving the simpler problem of recovering the mixing matrix A when x = As is observed. Arguably the two most widely used ICA algorithms are FastICA [13] and JADE [6]. Both of these algorithms are based on a two-step process: (1) The data is centered and whitened, that is, made to have identity covariance matrix. This is typically done using principal component analysis ...

560 | Robust principal component analysis
- Candès, Li, et al.
- 2011
Citation Context: ... methods for Gaussian noise-invariant ICA under the assumption that the noise parameters are known. Finally, there are several papers that considered the problem of performing PCA in a noisy framework. [5] gives a provably robust algorithm for PCA under a sparse noise model. [4] performs PCA robust to white Gaussian noise, and [9] performs PCA robust to white Gaussian noise and sparse noise. ...

95 | Variational principal components. In:
- Bishop
- 1999
Citation Context: ... noise parameters are known. Finally, there are several papers that considered the problem of performing PCA in a noisy framework. [5] gives a provably robust algorithm for PCA under a sparse noise model. [4] performs PCA robust to white Gaussian noise, and [9] performs PCA robust to white Gaussian noise and sparse noise. ...

49 | Independent component analysis in the presence of gaussian noise by maximizing joint likelihood.
- Hyvärinen
- 1998
Citation Context: ... This process of orthogonalizing the latent signals was called quasi-whitening in [2] and later in [3]. However, this conflicts with the definition of quasi-whitening given in [12], which requires the latent signals to be whitened. To avoid the confusion we will use the term quasi-orthogonalization for the process of orthogonalizing the latent signals. ...

47 | Learning mixtures of spherical gaussians: moment methods and spectral decompositions.
- Hsu, Kakade
- 2013
Citation Context: ... fourth cumulant. In section 3, we discuss the connection between the fourth-order cumulant tensor method for quasi-orthogonalization discussed in section 2 and the Hessian-based techniques seen in [2] and [11]. We use this connection to create a more computationally efficient and practically implementable version of the quasi-orthogonalization algorithm discussed in section 2. In section 4, we discuss new, ...

42 | Bayesian robust principal component analysis. Image Processing,
- Ding, He, et al.
- 2011
Citation Context: ... papers that considered the problem of performing PCA in a noisy framework. [5] gives a provably robust algorithm for PCA under a sparse noise model. [4] performs PCA robust to white Gaussian noise, and [9] performs PCA robust to white Gaussian noise and sparse noise. ...

29 | Blind source separation via the second characteristic function
- Yeredor
- 2000
Citation Context: ... algorithm with complete theoretical analysis was provided in [3]. That algorithm required estimating the full fourth-order cumulant tensor. We note that Hessian-based techniques for ICA were used in [21, 2, 11], with [11] and [2] using the Hessian of the fourth-order cumulant. The papers [21] and [11] proposed interesting randomized one-step noise-robust ICA algorithms based on the cumulant generating function ...

27 | Learning a parallelepiped: Cryptanalysis of GGH and NTRU signatures
- Nguyen, Regev
Citation Context: ... We show how these convergence rates follow directly from the properties of the cumulants, which sheds some light on the somewhat surprising cubic convergence seen in fourth-order based ICA methods [13, 18, 22]. The update step has complexity O(Nd) where N is the number of samples, giving a total algorithmic complexity of O(Nd³) for step 1 and O(Nd²t) for step 2, where t is the number of iterations for conv...

25 | Handbook of Blind Source Separation
- Comon, Jutten
- 2010
Citation Context: ... extensive literature on ICA in the signal processing and machine learning communities due to its applicability to a variety of important practical situations. For a comprehensive introduction see the books [8, 14]. In this paper we develop techniques for dealing with noisy data by introducing new and more efficient techniques for quasi-orthogonalization and subsequent component recovery. The quasi-orthogonaliz...

12 | Class of complex ICA algorithm based on the Kurtosis cost function
- Li, Adali
- 2008

10 | Provable ICA with unknown Gaussian noise, with implications for Gaussian mixtures and autoencoders.
- Arora, Ge, et al.
- 2012
Citation Context: ... process is no longer justified. This failure may be even more significant if the noise is not white, which is likely to be the case in many practical situations. Recent theoretical developments (see [2] and [3]) consider the case where the noise η is an arbitrary (not necessarily white) additive Gaussian variable drawn independently from s. In [2], it was observed that certain cumulant-based techniques ...

6 | Robust Higher Order Statistics
- Welling
- 2001
Citation Context: ... Matlab implementation of GI-ICA is available for download at http://sourceforge.net/projects/giica/. Finally, we observe that our method is partially compatible with the robust cumulants introduced in [20]. We briefly discuss how GI-ICA can be extended using these noise-robust techniques for ICA to reduce the impact of sparse noise. The paper is organized as follows. In section 2, we discuss the releva...

4 | Blind signal separation in the presence of Gaussian noise
- Belkin, Rademacher, et al.
- 2013

2 | A new learning algorithm for blind signal separation. Advances in Neural Information Processing Systems
- Amari, Cichocki, et al.
- 1996
Citation Context: ... Gaussian noise. We define 100% noise magnitude to mean variance 10, with 25% noise and 50% noise indicating variances 2.5 and 5 respectively. Performance was measured using the Amari Index introduced in [1]. Let B̂ denote the approximate demixing matrix returned by an ICA algorithm, and let M = B̂A. Then, the Amari index is given by: E := sum_{i=1}^n ( sum_{j=1}^n |m_ij| / max_k |m_ik| − 1 ) + sum_{j=1}^n ( sum_{i=1}^n |m_ij| / max_k |m_kj| − 1 ) ...
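The Amari index in this excerpt can be computed directly from M = B̂A. Below is a minimal NumPy sketch of the standard formulation (the function name `amari_index` is our own); it is zero exactly when M is a scaled permutation matrix, i.e. when demixing is perfect up to the usual ICA scaling/permutation ambiguity.

```python
import numpy as np

def amari_index(B_hat, A):
    """Amari index of M = B_hat @ A (square, n x n).
    Zero iff M is a scaled permutation matrix."""
    M = np.abs(B_hat @ A)
    n = M.shape[0]
    # sum_i ( sum_j |m_ij| / max_k |m_ik| - 1 ): row-wise term
    row_term = (M / M.max(axis=1, keepdims=True)).sum() - n
    # sum_j ( sum_i |m_ij| / max_k |m_kj| - 1 ): column-wise term
    col_term = (M / M.max(axis=0, keepdims=True)).sum() - n
    return row_term + col_term

# Perfect demixing (B_hat = A^(-1)) gives index ~ 0; a poor demixer does not.
A = np.array([[2.0, 1.0], [0.5, 3.0]])
print(amari_index(np.linalg.inv(A), A))   # ~ 0
print(amari_index(np.eye(2), A))          # > 0
```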

1 | Matlab JADE for real-valued data v 1.8. http://perso.telecom-paristech.fr/~cardoso/Algo/Jade/jadeR.m, 2005. [Online; accessed 8-May-2013]
- Cardoso, Souloumiac
Citation Context: ... Algorithm 2, then O(Nd²t) steps are required to recover all columns of R once quasi-orthogonalization has been achieved. In Figure 1, we compare our algorithms to the baselines JADE [7] and versions of FastICA [10], using the code made available by the authors. Except for the choice of the contrast function for FastICA, the baselines were run using default settings. All tests were do...

1 | Matlab FastICA v 2.5. http://research.ics.aalto.fi/ica/fastica/code/dlcode.shtml, 2005. [Online; accessed 1-May-2013]
- Gävert, Hurri, et al.
Citation Context: ... steps are required to recover all columns of R once quasi-orthogonalization has been achieved. In Figure 1, we compare our algorithms to the baselines JADE [7] and versions of FastICA [10], using the code made available by the authors. Except for the choice of the contrast function for FastICA, the baselines were run using default settings. All tests were done using artificially generat...

1 | What are cumulants?
- Mattner
- 1999
Citation Context: ... with respect to the additive Gaussian noise, the proposed methods will be admissible for both standard and noisy ICA. While cumulants are essentially unique with the additivity and homogeneity properties [17] when no restrictions are made on the probability space, the preprocessing step of ICA gives additional structure (like orthogonality and centering), providing additional admissible functions. In part...
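The additivity and homogeneity properties mentioned in this excerpt, together with the Gaussian invariance that motivates the paper, are easy to check numerically for the fourth cumulant. The demonstration below is our own illustrative code, using the standard sample estimator k4 = E[(x−mean)⁴] − 3·E[(x−mean)²]².

```python
import numpy as np

def kappa4(x):
    """Sample fourth cumulant: k4 = E[(x-mean)^4] - 3 E[(x-mean)^2]^2."""
    x = x - x.mean()
    return (x ** 4).mean() - 3.0 * (x ** 2).mean() ** 2

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 1_000_000)   # true k4 = -2/15 for Uniform(-1, 1)
y = rng.laplace(0.0, 1.0, 1_000_000)    # true k4 = 12 for Laplace(0, 1)
g = rng.normal(0.0, 2.0, 1_000_000)     # true k4 = 0 for any Gaussian

# Additivity for independent variables: k4(x + y) ~ k4(x) + k4(y).
print(kappa4(x + y), kappa4(x) + kappa4(y))
# Homogeneity of degree 4: k4(3x) = 3**4 * k4(x) (exact for this estimator).
print(kappa4(3 * x), 81 * kappa4(x))
# Gaussian invariance: additive Gaussian noise leaves k4 (nearly) unchanged,
# which is why fourth-cumulant methods tolerate Gaussian noise.
print(kappa4(x + g), kappa4(x))
```

The last property is the one exploited for noise-invariant ICA: since every cumulant of order above two vanishes for a Gaussian, the fourth cumulant of the observed signal is unaffected by the additive Gaussian noise term.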

1 | Matlab GI-ICA implementation. http://sourceforge.net/projects/giica
- Voss, Rademacher, et al.
- 2013
Citation Context: ... Except for the choice of the contrast function for FastICA, the baselines were run using default settings. All tests were done using artificially generated data. In implementing our algorithms (available at [19]), we opted to enforce orthogonality during the update step of Algorithm 2 with previously found columns of R. In Figure 1, comparison on five distributions indicates that each of the independent coor...

1 | How fast is FastICA. EUSIPCO
- Zarzoso, Comon
- 2006
Citation Context: ... We show how these convergence rates follow directly from the properties of the cumulants, which sheds some light on the somewhat surprising cubic convergence seen in fourth-order based ICA methods [13, 18, 22]. The update step has complexity O(Nd) where N is the number of samples, giving a total algorithmic complexity of O(Nd³) for step 1 and O(Nd²t) for step 2, where t is the number of iterations for conv...