Fast and robust fixed-point algorithms for independent component analysis (1999)
Download Links
- [www.cs.helsinki.fi]
- [www.cis.hut.fi]
- DBLP
Other Repositories/Bibliography
Venue: IEEE Trans. Neural Netw.
Citations: 883 (34 self)
Citations
12409 | Elements of information theory - Cover, Thomas - 1991 |
1850 | Independent component analysis, a new concept - Comon - 1994 |
1494 | An information-maximization approach to blind separation and blind deconvolution
- Bell, Sejnowski
- 1995
Citation Context ... ICA data model, minimization of mutual information, and projection pursuit. One popular way of formulating the ICA problem is to consider the estimation of the following generative model for the data [1, 3, 5, 6, 23, 24, 27, 28, 31]: x = As (2), where x is an observed m-dimensional vector, s is an n-dimensional (latent) random vector whose components are assumed mutually independent, and A is a constant m × n matrix to be estimat... |
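To make the model in this excerpt concrete, here is a minimal numpy sketch (not from the paper; the source distributions and mixing matrix below are hypothetical choices for illustration, since the model itself only assumes mutually independent, in practice non-Gaussian, components):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 10_000  # number of observed samples

# s: n-dimensional latent vector with mutually independent components.
# The particular distributions are illustrative; for identifiability,
# at most one component may be Gaussian.
s = np.vstack([
    rng.laplace(size=T),                           # super-Gaussian source
    rng.uniform(-np.sqrt(3), np.sqrt(3), size=T),  # sub-Gaussian source
])

# A: constant m x n mixing matrix, unknown to the estimator.
A = np.array([[1.0, 0.5],
              [0.3, 1.0]])

x = A @ s  # observed m-dimensional data, x = As
```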
1302 |
Emergence of simple-cell receptive field properties by learning a sparse code for natural images
- Olshausen, Field
- 1996
Citation Context ...ent of the i-th feature in the observed data vector x. The use of ICA for feature extraction is motivated by results in neurosciences that suggest that the similar principle of redundancy reduction [2], [32] explains some aspects of the early processing of sensory data by the brain. ICA has also applications in exploratory data analysis in the same way as the closely related method of projection pursuit ... |
1239 | Optimization by Vector Space Methods
- Luenberger
- 1969
Citation Context ..., we shall derive the fixed-point algorithm for one unit, with sphered data. First note that the maxima of J_G(w) are obtained at certain optima of E{G(w^T x)}. According to the Kuhn-Tucker conditions [29], the optima of E{G(w^T x)} under the constraint E{(w^T x)^2} = ||w||^2 = 1 are obtained at points where E{xg(w^T x)} − βw = 0 (17), where β is a constant that can be easily evaluated to give β = E{w^T ... |
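The excerpt is cut off mid-formula. As a reference sketch (reconstructed from the constraint stated above, not quoted from the truncated text), left-multiplying the stationarity condition by w^T and using ||w||^2 = 1 yields the constant:

```latex
% Stationarity condition (17): E\{x\,g(w^T x)\} - \beta w = 0.
% Left-multiply by w^T and use w^T w = \|w\|^2 = 1:
\begin{aligned}
  w^T E\{x\, g(w^T x)\} - \beta\, w^T w &= 0 \\
  \beta &= E\{w^T x \; g(w^T x)\}.
\end{aligned}
```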
651 | A fixed-point algorithm for independent component analysis
- Hyvärinen, Oja
- 1997
Citation Context ...ltiplying both sides of the first equation in (19) by β − E{g′(w^T x)}. This gives the following fixed-point algorithm: w+ = E{xg(w^T x)} − E{g′(w^T x)}w, w* = w+/||w+||, which was introduced in [17] using a more heuristic derivation. An earlier version (for kurtosis only) was derived as a fixed-point iteration of a neural learning rule in [23], which is where its name comes from. We retain this ... |
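A minimal numpy sketch of this one-unit iteration (an illustration under stated assumptions, not the authors' released code): the data matrix X is assumed already sphered (whitened), expectations are replaced by sample means, and g is taken to be the common choice g(u) = tanh(u), so that g'(u) = 1 - tanh(u)^2:

```python
import numpy as np

def fastica_one_unit(X, max_iter=200, tol=1e-8, seed=0):
    """One-unit fixed-point iteration on sphered data X of shape (m, T).

    Implements w+ = E{x g(w^T x)} - E{g'(w^T x)} w followed by the
    renormalization w* = w+ / ||w+||, with g(u) = tanh(u).
    """
    g = np.tanh
    g_prime = lambda u: 1.0 - np.tanh(u) ** 2
    rng = np.random.default_rng(seed)
    m, _ = X.shape
    w = rng.standard_normal(m)
    w /= np.linalg.norm(w)
    for _ in range(max_iter):
        wx = w @ X                                         # samples of w^T x
        w_new = (X * g(wx)).mean(axis=1) - g_prime(wx).mean() * w
        w_new /= np.linalg.norm(w_new)                     # w* = w+ / ||w+||
        if 1.0 - abs(w_new @ w) < tol:                     # sign-invariant test
            return w_new
        w = w_new
    return w
```

The absolute value in the convergence test reflects the sign indeterminacy of w; estimating several components would additionally require a decorrelation step between units, e.g. deflation as in [9].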
616 | The “independent components” of natural scenes are edge filters
- Bell, Sejnowski
- 1997
Citation Context ...each other, and thus the signals can be recovered from linear mixtures x_i by finding a transformation in which the transformed signals are as independent as possible, as in ICA. In feature extraction [4, 25], s_i is the coefficient of the i-th feature in the observed data vector x. The use of ICA for feature extraction is motivated by results in neurosciences that suggest that the similar principle of red... |
559 |
Blind separation of sources, Part I: An adaptive algorithm based on neuromimetic architecture
- Jutten, Hérault
- 1991
Citation Context ...es and methods have been developed to find such a linear representation, including principal component analysis [30], factor analysis [15], projection pursuit [12, 16], independent component analysis [27], etc. The transformation may be defined using such criteria as optimal dimension reduction, statistical ’interestingness’ of the resulting components s_i, simplicity of the transformation, or other cr... |
493 |
Modern factor analysis
- Harman
- 1967
Citation Context ...ariables s = Wx (1) has some suitable properties. Several principles and methods have been developed to find such a linear representation, including principal component analysis [30], factor analysis [15], projection pursuit [12, 16], independent component analysis [27], etc. The transformation may be defined using such criteria as optimal dimension reduction, statistical ’interestingness’ of the resu... |
471 |
Possible principles underlying the transformations of sensory messages.
- Barlow
- 1961
Citation Context ...nsformation is to find a representation in which the transformed components s_i are statistically as independent from each other as possible. Thus this method is a special case of redundancy reduction [2]. Two promising applications of ICA are blind source separation and feature extraction. In blind source separation [27], the observed values of x correspond to a realization of an m-dimensional discre... |
356 | Independent component filters of natural images compared with simple cells in primary visual cortex
- van Hateren, van der Schaaf
- 1998
Citation Context ...t functions and algorithms introduced above. These applications include artifact cancellation in EEG and MEG [36, 37], decomposition of evoked fields in MEG [38], and feature extraction of image data [35, 25]. These experiments further validate the ICA methods introduced in this paper. A Matlab™ implementation of the fixed-point algorithm is available on the World Wide Web free of charge [10]. |
340 |
A projection pursuit algorithm for exploratory data analysis
- Friedman, Tukey
- 1974
Citation Context ...he mutual information is roughly equivalent to finding directions in which the negentropy is maximized. This formulation of ICA also shows explicitly the connection between ICA and projection pursuit [11, 12, 16, 26]. In fact, finding a single direction that maximizes negentropy is a form of projection pursuit, and could also be interpreted as estimation of a single independent component [24]. 2.2 Contrast Functi... |
320 | Exploratory projection pursuit
- Friedman
- 1987
Citation Context ...ome suitable properties. Several principles and methods have been developed to find such a linear representation, including principal component analysis [30], factor analysis [15], projection pursuit [12, 16], independent component analysis [27], etc. The transformation may be defined using such criteria as optimal dimension reduction, statistical ’interestingness’ of the resulting components s_i, simplici... |
299 |
Projection pursuit
- Huber
- 1985
Citation Context ...ome suitable properties. Several principles and methods have been developed to find such a linear representation, including principal component analysis [30], factor analysis [15], projection pursuit [12, 16], independent component analysis [27], etc. The transformation may be defined using such criteria as optimal dimension reduction, statistical ’interestingness’ of the resulting components s_i, simplici... |
173 |
Adaptive blind separation of independent sources: a deflation approach.
- Delfosse, Loubaton
- 1995
Citation Context ...and distributions of the s_i. In particular, if G(u) = u^4, the condition is fulfilled for any distribution of non-zero kurtosis. In that case, it can also be proven that there are no spurious optima [9]. 3.1.2 Asymptotic variance: Asymptotic variance is one criterion for choosing the function G to be used in the contrast function. Comparison of, say, the traces of the asymptotic covariance matrices o... |
170 |
Independent component analysis, a new concept
- Comon
- 1994
Citation Context ... the transformation, or other criteria, including application-oriented ones. We treat in this paper the problem of estimating the transformation given by (linear) independent component analysis (ICA) [7, 27]. As the name implies, the basic goal in determining the transformation is to find a representation in which the transformed components s_i are statistically as independent from each other as possible.... |
154 |
Principal components, minor components, and linear neural networks
- Oja
- 1992
Citation Context ...tion of the observed variables s = Wx (1) has some suitable properties. Several principles and methods have been developed to find such a linear representation, including principal component analysis [30], factor analysis [15], projection pursuit [12, 16], independent component analysis [27], etc. The transformation may be defined using such criteria as optimal dimension reduction, statistical ’intere... |
138 |
A class of neural networks for independent component analysis
- Karhunen, Oja, et al.
- 1997
Citation Context ... ICA data model, minimization of mutual information, and projection pursuit. One popular way of formulating the ICA problem is to consider the estimation of the following generative model for the data [1, 3, 5, 6, 23, 24, 27, 28, 31]: x = As (2), where x is an observed m-dimensional vector, s is an n-dimensional (latent) random vector whose components are assumed mutually independent, and A is a constant m × n matrix to be estimat... |
134 |
Separation of a mixture of independent sources through a maximum likelihood approach
- Pham, Garat, et al.
- 1992
Citation Context ... of s_i, and k1, k2, k3 are arbitrary constants. For simplicity, one can choose G_opt(u) = log f_i(u). Thus the optimal contrast function is the same as the one obtained by the maximum likelihood approach [34], or the infomax approach [3]. Almost identical results have also been obtained in [5] for another algorithm. The theorem above treats, however, the one-unit case instead of the multi-unit case treate... |
126 |
Robust Statistics
- Hampel, Ronchetti, et al.
- 1986
Citation Context ...treats, however, the one-unit case instead of the multi-unit case treated by the other authors. 3.1.3 Robustness: Another very attractive property of an estimator is robustness against outliers [14]. This means that single, highly erroneous observations do not have much influence on the estimator. To obtain a simple form of robustness called B-robustness, we would like the estimator to have a bo... |
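For reference, the influence function whose boundedness defines B-robustness is the standard one from robust statistics (general background, not quoted from the excerpt): for an estimator viewed as a functional T of the data distribution F, and a point mass δ_x at a contaminating observation x,

```latex
\mathrm{IF}(x; T, F) \;=\; \lim_{t \to 0^{+}}
  \frac{T\!\big((1-t)\,F + t\,\delta_{x}\big) - T(F)}{t}.
```

B-robustness then means that sup_x |IF(x; T, F)| is finite, i.e. no single observation can move the estimate arbitrarily far.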
118 |
A new learning algorithm for blind source separation
- Amari, Cichocki, et al.
- 1996
Citation Context ... ICA data model, minimization of mutual information, and projection pursuit. One popular way of formulating the ICA problem is to consider the estimation of the following generative model for the data [1, 3, 5, 6, 23, 24, 27, 28, 31]: x = As (2), where x is an observed m-dimensional vector, s is an n-dimensional (latent) random vector whose components are assumed mutually independent, and A is a constant m × n matrix to be estimat... |
112 |
What is projection pursuit?
- Jones, Sibson
- 1987
Citation Context ...he mutual information is roughly equivalent to finding directions in which the negentropy is maximized. This formulation of ICA also shows explicitly the connection between ICA and projection pursuit [11, 12, 16, 26]. In fact, finding a single direction that maximizes negentropy is a form of projection pursuit, and could also be interpreted as estimation of a single independent component [24]. 2.2 Contrast Functi... |
107 | Natural image statistics and efficient coding
- Olshausen, Field
- 1996
Citation Context ...cient of the i-th feature in the observed data vector x. The use of ICA for feature extraction is motivated by results in neurosciences that suggest that the similar principle of redundancy reduction [2, 32] explains some aspects of the early processing of sensory data by the brain. ICA has also applications in exploratory data analysis in the same way as the closely related method of projection pursuit ... |
101 | New approximations of differential entropy for independent component analysis and projection pursuit.
- Hyvärinen
- 1998
Citation Context ...ough Approximations of Negentropy: To use the definition of ICA given above, a simple estimate of the negentropy (or of differential entropy) is needed. We use here the new approximations developed in [19], based on the maximum entropy principle. In [19] it was shown that these approximations are often considerably more accurate than the conventional, cumulant-based approximations in [7, 1, 26]. In the... |
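In its simplest one-function form, the approximation family of [19] reads J(y) ≈ c [E{G(y)} − E{G(ν)}]^2 for zero-mean, unit-variance y, where ν is a standardized Gaussian variable and c a positive constant. A small numpy sketch (an illustration; the helper name negentropy_approx is hypothetical, and E{G(ν)} is estimated by Monte Carlo rather than taken from a closed form):

```python
import numpy as np

def negentropy_approx(y, G=lambda u: np.log(np.cosh(u)), n_mc=100_000, seed=0):
    """Approximate negentropy J(y), up to a constant, via
    (E{G(y)} - E{G(nu)})^2 with nu a standard Gaussian variable.

    G defaults to log cosh, one of the robust contrast choices
    discussed in the paper.
    """
    y = (y - y.mean()) / y.std()       # the approximation assumes
                                       # zero mean and unit variance
    rng = np.random.default_rng(seed)
    nu = rng.standard_normal(n_mc)     # Monte Carlo estimate of E{G(nu)}
    return (G(y).mean() - G(nu).mean()) ** 2
```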
Independent component analysis by general nonlinear Hebbian-like learning rules
- Hyvärinen, Oja
- 1998
Citation Context ... ICA data model, minimization of mutual information, and projection pursuit. One popular way of formulating the ICA problem is to consider the estimation of the following generative model for the data [1, 3, 5, 6, 23, 24, 27, 28, 31]: x = As (2), where x is an observed m-dimensional vector, s is an n-dimensional (latent) random vector whose components are assumed mutually independent, and A is a constant m × n matrix to be estimat... |
65 | Independent component analysis for identification of artifacts in magnetoencephalographic recordings
- Vigário, Jousmäki, et al.
- 1998
Citation Context ...Experiments on different kinds of real life data have also been performed using the contrast functions and algorithms introduced above. These applications include artifact cancellation in EEG and MEG [36, 37], decomposition of evoked fields in MEG [38], and feature extraction of image data [35, 25]. These experiments further validate the ICA methods introduced in this paper. A Matlab™ implementation of... |
64 |
The nonlinear PCA learning rule in independent component analysis.
- Oja
- 1997
Citation Context ... ICA data model, minimization of mutual information, and projection pursuit. One popular way of formulating the ICA problem is to consider the estimation of the following generative model for the data [1, 3, 5, 6, 23, 24, 27, 28, 31]: x = As (2), where x is an observed m-dimensional vector, s is an n-dimensional (latent) random vector whose components are assumed mutually independent, and A is a constant m × n matrix to be estimat... |
48 |
Extraction of ocular artifacts from EEG using independent component analysis. Electroencephalography and Clinical Neurophysiology
- Vigário
- 1997
Citation Context ...Experiments on different kinds of real life data have also been performed using the contrast functions and algorithms introduced above. These applications include artifact cancellation in EEG and MEG [36, 37], decomposition of evoked fields in MEG [38], and feature extraction of image data [35, 25]. These experiments further validate the ICA methods introduced in this paper. A Matlab™ implementation of... |
47 |
Laheld."Equivariant adaptive source separation
- Cardoso, H
- 1996
Citation Context ... ICA data model, minimization of mutual information, and projection pursuit. One popular way of formulating the ICA problem is to consider the estimation of the following generative model for the data [1, 3, 5, 6, 23, 24, 27, 28, 31]: x = As (2), where x is an observed m-dimensional vector, s is an n-dimensional (latent) random vector whose components are assumed mutually independent, and A is a constant m × n matrix to be estimat... |
33 | The fixed-point algorithm and maximum likelihood estimation for independent component analysis
- Hyvärinen
- 1999
Citation Context ...neither sphering nor inversion of the covariance matrix is needed. In fact, such an algorithm can be considered as a fixed-point algorithm for maximum likelihood estimation of the ICA data model, see [21]. 4.4 Properties of the Fixed-Point Algorithm: The fixed-point algorithm and the underlying contrast functions have a number of desirable properties when compared with existing methods for ICA. • The c... |
27 | One-unit contrast functions for independent component analysis: A statistical analysis
- Hyvärinen
- 1997
Citation Context ... G to be used in the contrast function. Comparison of, say, the traces of the asymptotic covariance matrices of two estimators enables direct comparison of the mean-square error of the estimators. In [18], evaluation of asymptotic variances was addressed using a related family of contrast functions. In fact, it can be seen that the results in [18] are valid even in this case, and thus we have the foll... |
22 | A fast algorithm for estimating overcomplete ICA bases for image windows
- Hyvärinen, Cristescu, et al.
- 1999
Citation Context ...alidated the novel contrast functions and algorithms introduced. Some extensions of the methods introduced in this paper are presented in [20], in which the problem of noisy data is addressed, and in [22], which deals with the situation where there are more independent components than observed variables. APPENDIX A (PROOFS), A. Proof of Convergence of Algorithm (20): The convergence is proven under the as... |
22 |
Image feature extraction by sparse coding and independent component analysis
- Hyvärinen, Oja, Hoyer, Hurri
- 1998
Citation Context ...t functions and algorithms introduced above. These applications include artifact cancellation in EEG and MEG [36, 37], decomposition of evoked fields in MEG [38], and feature extraction of image data [35, 25]. These experiments further validate the ICA methods introduced in this paper. A Matlab™ implementation of the fixed-point algorithm is available on the World Wide Web free of charge [10]. |
16 | A family of fixed-point algorithms for independent component analysis - Hyvärinen - 1997 |
10 | An experimental comparison of neural ICA algorithms
- Giannakopoulos, Karhunen, et al.
- 1998
Citation Context ... to achieve the maximum accuracy allowed by the data. This illustrates the fast convergence of the fixed-point algorithm. In fact, a comparison of our algorithm with other algorithms was performed in [13], showing that the fixed-point algorithm gives approximately the same statistical efficiency as other algorithms, but with a fraction of the computational cost. Experiments on different kinds of real ... |
10 | Independent component analysis in wave decomposition of auditory evoked fields
- Vigário, Särelä, et al.
Citation Context ... have also been performed using the contrast functions and algorithms introduced above. These applications include artifact cancellation in EEG and MEG [36, 37], decomposition of evoked fields in MEG [38], and feature extraction of image data [35, 25]. These experiments further validate the ICA methods introduced in this paper. A Matlab™ implementation of the fixed-point algorithm is available on the Wor... |
4 |
Neural Networks for Signal Processing and Optimization
- Cichocki, Unbehauen
- 1994
Citation Context ... ICA data model, minimization of mutual information, and projection pursuit. One popular way of formulating the ICA problem is to consider the estimation of the following generative model for the data [1, 3, 5, 6, 23, 24, 27, 28, 31]: x = As (2), where x is an observed m-dimensional vector, s is an n-dimensional (latent) random vector whose components are assumed mutually independent, and A is a constant m × n matrix to be estimat... |
2 | The FastICA MATLAB package. Available at http://www.cis.hut.fi/projects/ica/fastica/ |
1 |
Fast independent component analysis with noisy data using Gaussian moments
- Hyvärinen
- 1999
Citation Context .... Simulations as well as applications on real-life data have validated the novel contrast functions and algorithms introduced. Some extensions of the methods introduced in this paper are presented in [20], in which the problem of noisy data is addressed, and in [22], which deals with the situation where there are more independent components than observed variables. APPENDIX A (PROOFS), A. Proof of Conver... |