Invited article
Independent component analysis: algorithms and applications
Section snippets
Motivation
Imagine that you are in a room where two people are speaking simultaneously. You have two microphones, which you hold in different locations. The microphones give you two recorded time signals, which we could denote by x1(t) and x2(t), with x1 and x2 the amplitudes, and t the time index. Each of these recorded signals is a weighted sum of the speech signals emitted by the two speakers, which we denote by s1(t) and s2(t). We could express this as a linear equation:
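As a quick numerical sketch of this mixing (the source waveforms and the weights a11…a22 below are illustrative choices, not values from the paper):

```python
import numpy as np

# Two source signals: a sinusoid and a square wave (illustrative choices).
t = np.linspace(0, 1, 1000)
s1 = np.sin(2 * np.pi * 5 * t)           # speaker 1
s2 = np.sign(np.sin(2 * np.pi * 3 * t))  # speaker 2

# Each microphone records a weighted sum of the two speech signals.
# The weights depend on the distances of the microphones to the speakers
# (the numeric values here are assumed for the demo).
a11, a12 = 0.7, 0.3
a21, a22 = 0.4, 0.6
x1 = a11 * s1 + a12 * s2  # recorded signal at microphone 1
x2 = a21 * s1 + a22 * s2  # recorded signal at microphone 2
```

The ICA problem is then to recover s1 and s2 from x1 and x2 alone, without knowing the weights.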
Definition of ICA
To rigorously define ICA (Comon, 1994, Jutten and Herault, 1991), we can use a statistical “latent variables” model. Assume that we observe n linear mixtures x1,…,xn of n independent components s1,…,sn.
We have now dropped the time index t; in the ICA model, we assume that each mixture xj as well as each independent component sk is a random variable, instead of a proper time signal. The observed values xj(t), e.g., the microphone signals in the cocktail party problem, are then a sample of these random variables.
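Written out, the statistical model just described is the standard noise-free ICA model, with the mixing coefficients a_jk collected into a matrix A:

```latex
x_j = a_{j1} s_1 + a_{j2} s_2 + \cdots + a_{jn} s_n, \quad j = 1, \dots, n,
\qquad \text{or in vector--matrix form} \qquad
\mathbf{x} = \mathbf{A}\mathbf{s}.
```

Both the mixing matrix A and the independent components s are unknown; only the mixtures x are observed.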
Definition and fundamental properties
To define the concept of independence, consider two scalar-valued random variables y1 and y2. Basically, the variables y1 and y2 are said to be independent if information on the value of y1 does not give any information on the value of y2, and vice versa. Above, we noted that this is the case with the variables s1, s2 but not with the mixture variables x1, x2.
Technically, independence can be defined in terms of probability densities. Let us denote by p(y1,y2) the joint probability density function (pdf) of y1 and y2.
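Stated compactly, y1 and y2 are independent if and only if the joint density factorizes into the product of the marginal densities:

```latex
p(y_1, y_2) = p_1(y_1)\, p_2(y_2),
```

where p1(y1) and p2(y2) denote the marginal pdfs of y1 and y2, respectively.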
“Non-Gaussian is independent”
Intuitively speaking, the key to estimating the ICA model is non-Gaussianity. Actually, without non-Gaussianity the estimation is not possible at all, as mentioned in Section 3.3. This is at the same time probably the main reason for the rather late resurgence of ICA research: In most of classical statistical theory, random variables are assumed to have Gaussian distributions, thus precluding any methods related to ICA.
The Central Limit Theorem, a classical result in probability theory, tells that the distribution of a sum of independent random variables tends toward a Gaussian distribution, under certain conditions. Thus, a sum of two independent random variables usually has a distribution that is closer to Gaussian than either of the two original variables.
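This tendency of sums toward Gaussianity can be checked numerically; the helper function and sample sizes below are illustrative choices (excess kurtosis is zero for a Gaussian and -1.2 for a uniform distribution):

```python
import numpy as np

rng = np.random.default_rng(0)

def excess_kurtosis(y):
    """Fourth standardized moment minus 3 (zero for a Gaussian)."""
    y = (y - y.mean()) / y.std()
    return np.mean(y ** 4) - 3.0

n = 200_000
u = rng.uniform(-1, 1, size=(8, n))  # 8 independent non-Gaussian (uniform) variables

single = u[0]           # one uniform variable: clearly non-Gaussian
summed = u.sum(axis=0)  # sum of 8 independent variables: much closer to Gaussian

print(excess_kurtosis(single))  # ≈ -1.2 (uniform distribution)
print(excess_kurtosis(summed))  # ≈ -1.2/8, much closer to the Gaussian value 0
```

This is exactly why non-Gaussianity can serve as a measure of independence: a mixture of independent components is more Gaussian than the components themselves.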
Preprocessing for ICA
In the preceding section, we discussed the statistical principles underlying ICA methods. Practical algorithms based on these principles will be discussed in the next section. However, before applying an ICA algorithm on the data, it is usually very useful to do some preprocessing. In this section, we discuss some preprocessing techniques that make the problem of ICA estimation simpler and better conditioned.
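A minimal sketch of the two standard preprocessing steps, centering and whitening, assuming the data are stored as a NumPy array with one row per mixture (function names and the demo data are mine, not from the paper):

```python
import numpy as np

def center(X):
    """Subtract the mean of each row (each mixture signal)."""
    return X - X.mean(axis=1, keepdims=True)

def whiten(X):
    """Linearly transform centered X so its covariance becomes the identity,
    using the eigenvalue decomposition of the covariance matrix."""
    cov = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(cov)        # cov = E diag(d) E^T
    V = E @ np.diag(d ** -0.5) @ E.T  # whitening matrix V = E D^{-1/2} E^T
    return V @ X, V

# Illustrative data: two mixtures of non-Gaussian (Laplacian) sources.
rng = np.random.default_rng(0)
S = rng.laplace(size=(2, 5000))
A = np.array([[0.7, 0.3], [0.4, 0.6]])
X = center(A @ S)
Z, V = whiten(X)
print(Z @ Z.T / Z.shape[1])  # ≈ identity matrix
```

After whitening, the effective mixing matrix is orthogonal, which roughly halves the number of parameters left to estimate.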
The FastICA algorithm
In the preceding sections, we introduced different measures of non-Gaussianity, i.e. objective functions for ICA estimation. In practice, one also needs an algorithm for maximizing the contrast function, for example the one in Eq. (25). In this section, we introduce a very efficient method of maximization suited for this task. It is here assumed that the data is preprocessed by centering and whitening as discussed in the preceding section.
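A sketch of the one-unit FastICA fixed-point iteration with the tanh nonlinearity, assuming centered and whitened input (the function and variable names and the demo setup are mine; the paper discusses several alternative contrast functions):

```python
import numpy as np

def fastica_one_unit(Z, n_iter=200, seed=0):
    """One-unit FastICA fixed-point iteration on centered, whitened data Z
    (one row per mixture), using the tanh contrast function."""
    g = np.tanh
    g_prime = lambda u: 1 - np.tanh(u) ** 2
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(Z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        wz = w @ Z
        # Fixed-point update: w <- E{z g(w^T z)} - E{g'(w^T z)} w, then normalize.
        w_new = (Z * g(wz)).mean(axis=1) - g_prime(wz).mean() * w
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1) < 1e-10  # direction fixed up to sign
        w = w_new
        if converged:
            break
    return w

# Demo: recover one component from whitened mixtures of two Laplacian sources.
rng = np.random.default_rng(1)
S = rng.laplace(size=(2, 20000))
X = np.array([[0.7, 0.3], [0.4, 0.6]]) @ S
X -= X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / X.shape[1])
Z = E @ np.diag(d ** -0.5) @ E.T @ X  # whitened data
w = fastica_one_unit(Z)
y = w @ Z                             # estimate of one independent component
```

To estimate several components, the iteration is run for several weight vectors that are decorrelated from each other after each step.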
Applications of ICA
In this section we review some applications of ICA. The most classical application of ICA, the cocktail-party problem, was already explained in Section 1 of this paper.
Conclusion
ICA is a very general-purpose statistical technique in which observed random data are linearly transformed into components that are maximally independent from each other, and simultaneously have “interesting” distributions. ICA can be formulated as the estimation of a latent variable model. The intuitive notion of maximum non-Gaussianity can be used to derive different objective functions whose optimization enables the estimation of the ICA model. Alternatively, one may use more classical notions, such as maximum likelihood estimation or minimization of mutual information, to estimate the ICA model; these criteria turn out to be essentially equivalent to maximization of non-Gaussianity.
References (45)

- Amari, S., Cichocki, A., Yang, H.H. (1996). A new learning algorithm for blind source separation. Advances in Neural Information Processing Systems.
- Back, A.D., Weigend, A.S. (1998). A first application of independent component analysis to extracting structure from stock returns. International Journal of Neural Systems.
- Bell, A.J., Sejnowski, T.J. (1995). An information-maximization approach to blind separation and blind deconvolution. Neural Computation.
- Bell, A.J., Sejnowski, T.J. (1997). The ‘independent components’ of natural scenes are edge filters. Vision Research.
- Cardoso, J.-F. (1997). Infomax and maximum likelihood for source separation. IEEE Signal Processing Letters.
- Cardoso, J.-F., Laheld, B.H. (1996). Equivariant adaptive source separation. IEEE Transactions on Signal Processing.
- Cichocki, A., Unbehauen, R. (1996). Robust neural networks with on-line learning for blind identification and blind separation of sources. IEEE Transactions on Circuits and Systems.
- Comon, P. (1994). Independent component analysis—a new concept? Signal Processing.
- Cover, T.M., Thomas, J.A. (1991). Elements of Information Theory. Wiley.
- Delfosse, N., Loubaton, P. (1995). Adaptive blind separation of independent sources: a deflation approach. Signal Processing.
- Donoho, D.L., Johnstone, I.M., Kerkyacharian, G., Picard, D. (1995). Wavelet shrinkage: asymptopia? Journal of the Royal Statistical Society, Series B.
- Friedman, J.H. (1987). Exploratory projection pursuit. Journal of the American Statistical Association.
- Friedman, J.H., Tukey, J.W. (1974). A projection pursuit algorithm for exploratory data analysis. IEEE Transactions on Computers.
- Gonzalez, R.C., Wintz, P. (1987). Digital Image Processing. Addison-Wesley.
- Huber, P.J. (1985). Projection pursuit. The Annals of Statistics.
- Hyvärinen, A. (1998). Independent component analysis in the presence of Gaussian noise by maximizing joint likelihood. Neurocomputing.
- Hyvärinen, A. (1998). New approximations of differential entropy for independent component analysis and projection pursuit. Advances in Neural Information Processing Systems.
- Hyvärinen, A., Oja, E. (1998). Independent component analysis by general nonlinear Hebbian-like learning rules. Signal Processing.
- Jutten, C., Herault, J. (1991). Blind separation of sources, part I: an adaptive algorithm based on neuromimetic architecture. Signal Processing.
- Vigário, R. (1997). Extraction of ocular artifacts from EEG using independent component analysis. Electroencephalography and Clinical Neurophysiology.