Independent component analysis (ICA) by entropy bound minimization (EBM) and entropy rate minimization (ERM)

We present three real-valued ICA algorithms that allow nonorthogonal separation of the sources. It is known that in ICA there is no need to estimate the density or entropy of the sources with great precision; hence, instead of minimizing the entropy itself, these algorithms minimize an entropy bound, which can be computed quickly by numerical methods. A decoupling technique is used to optimize the separation matrix: the matrix is optimized row by row, as in an orthogonal separation algorithm (a minimal sketch of this decoupling appears after the list below). This decoupling makes fast nonorthogonal separation of a large number of sources possible.

  1. Real-valued ICA by entropy bound minimization (ICA-EBM) [1,2]
  2. Real-valued ICA by entropy rate bound minimization (ICA-ERBM) [3]
  3. Real-valued ICA by entropy rate minimization (ICA-ERM) [4]
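
To illustrate the decoupling technique mentioned above, the following sketch (in Python with NumPy; the helper name decoupling_vector is ours, not from the papers) computes, for a given row index n, a unit vector h_n orthogonal to all other rows of the separation matrix W. Since |det W| = |h_n^T w_n| * c_n with c_n independent of w_n, the cost for row n can then be optimized independently of the other rows.

    import numpy as np

    def decoupling_vector(W, n):
        """Unit vector h_n orthogonal to every row of W except row n.

        With h_n in hand, |det W| = |h_n^T w_n| * c_n, where c_n does not
        depend on w_n, so row n of W can be optimized on its own.
        """
        B = np.delete(W, n, axis=0)      # the other N-1 rows
        _, _, Vt = np.linalg.svd(B)      # h_n spans the 1-D null space of B
        h = Vt[-1]                       # right singular vector of the
        return h / np.linalg.norm(h)     # (near-)zero singular value

Each row update then reduces to minimizing the entropy (bound) of y_n = w_n^T x minus log|h_n^T w_n|, a small problem solved independently for each row.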

Real-valued ICA-EBM

ICA by entropy bound minimization provides flexible density matching based on the maximum entropy principle: four nonlinearities serve as measuring functions for computing the entropy bound, and the associated maximum entropy densities can be symmetric or skewed, heavy-tailed or light-tailed.
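
As a rough illustration of this idea, the sketch below bounds the entropy of a standardized signal using several measuring functions and keeps the tightest bound. It substitutes the classic maximum entropy (negentropy-style) approximation for the numerically tabulated bound of [2], and the specific nonlinearities and unit constant are our choices, so treat it as a stand-in rather than the paper's estimator.

    import numpy as np

    # Illustrative measuring functions (ours, not the paper's exact set).
    G_FUNCS = [
        lambda y: np.log(np.cosh(y)),      # symmetric
        lambda y: -np.exp(-y**2 / 2),      # symmetric
        lambda y: y * np.exp(-y**2 / 2),   # odd: sensitive to skewness
        lambda y: np.abs(y),               # symmetric, heavy-tailed
    ]

    def entropy_bound(y, n_mc=100_000, seed=0):
        """Crude upper bound on H(y); y is standardized internally.

        Uses H(y) <= H(gauss) - (E[G(y)] - E[G(v)])^2 with v ~ N(0, 1),
        keeping the smallest (tightest) bound over the measuring functions.
        """
        y = (y - y.mean()) / y.std()
        v = np.random.default_rng(seed).standard_normal(n_mc)
        h_gauss = 0.5 * np.log(2 * np.pi * np.e)   # entropy of N(0, 1)
        gaps = [np.mean(G(y)) - np.mean(G(v)) for G in G_FUNCS]
        return min(h_gauss - g**2 for g in gaps)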

Real-valued ICA-ERBM

ICA by entropy rate bound minimization takes both non-Gaussianity and sample correlation into account by minimizing the mutual information rate; it was originally introduced as full blind source separation (FBSS). By assuming that the sources are outputs of linear systems driven by independently and identically distributed (i.i.d.) processes, the algorithm converts the entropy rate estimation problem into an entropy estimation problem, which is solved using EBM.
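
A minimal sketch of that conversion, under an AR-source assumption (the AR order p and the least-squares fit are our choices, and entropy_bound() and NumPy come from the previous sketch): fit a monic AR model, recover the innovation sequence, and estimate its entropy. For a minimum-phase monic AR filter the filtering term of the entropy rate vanishes, so the rate reduces to the innovation entropy.

    def entropy_rate_estimate(s, p=4):
        """Entropy rate of s under an AR(p) model driven by i.i.d. noise."""
        s = (s - s.mean()) / s.std()
        # Least-squares AR fit: s[t] ~ sum_k a[k] * s[t-1-k]
        X = np.column_stack([s[p - 1 - k : len(s) - 1 - k] for k in range(p)])
        a, *_ = np.linalg.lstsq(X, s[p:], rcond=None)
        e = s[p:] - X @ a              # innovation (prediction residual)
        return entropy_bound(e)        # entropy of the i.i.d. driving process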

Real-valued ICA-ERM

ICA by entropy rate minimization (ERM) methods use the mutual information rate, which leads to minimization of the entropy rate, as the cost function, taking both non-Gaussianity and sample dependence into account. Estimating the entropy rate is the most difficult part of the problem; the ERM methods do so by assuming either a Markovian or an invertible-filter source model. Entropy rate minimization via multivariate generalized Gaussian distribution (ERM-MG) assumes a Markovian model with a multivariate generalized Gaussian distribution as the source prior. Both entropy rate minimization via AR driven by a GGD process (ERM-ARG) and entropy rate bound minimization via AR source (ERBM-AR) assume the source is generated by an AR model driven by an i.i.d. process. For ERM-ARG, the innovation process is modeled by a generalized Gaussian distribution; for ERBM-AR, the distribution of the innovation process is assumed unknown and is modeled by a maximum entropy distribution.
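
For a GGD innovation model as in ERM-ARG, the differential entropy of the fitted distribution is available in closed form. The sketch below uses a moment-matching fit (the root-search bracket is our assumption and the function name ggd_entropy is ours): estimate the shape beta from the ratio E[|x|]^2 / E[x^2], then evaluate the closed-form entropy.

    import numpy as np
    from scipy.special import gamma
    from scipy.optimize import brentq

    def ggd_entropy(x):
        """Entropy of x under a fitted zero-mean generalized Gaussian.

        GGD density: p(x) = beta / (2 a G(1/beta)) * exp(-(|x|/a)**beta),
        with G the gamma function. Shape beta is fitted by matching
            r(beta) = E[|x|]**2 / E[x**2]
                    = G(2/beta)**2 / (G(1/beta) * G(3/beta)),
        and the entropy is H = 1/beta + log(2 a G(1/beta) / beta).
        """
        m1 = np.mean(np.abs(x))
        m2 = np.mean(x**2)
        r_hat = m1**2 / m2
        r = lambda b: gamma(2 / b)**2 / (gamma(1 / b) * gamma(3 / b))
        # Assumes r_hat lies within the bracket [r(0.1), r(10)].
        beta = brentq(lambda b: r(b) - r_hat, 0.1, 10.0)
        a = np.sqrt(m2 * gamma(1 / beta) / gamma(3 / beta))  # scale
        return 1 / beta + np.log(2 * a * gamma(1 / beta) / beta)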


References:

[1] X.-L. Li and T. Adali, "A novel entropy estimator and its application to ICA," in Proc. IEEE Workshop on Machine Learning for Signal Processing (MLSP), Grenoble, France, Sep. 2009.
[2] X.-L. Li and T. Adali, "Independent component analysis by entropy bound minimization," IEEE Trans. Signal Processing, vol. 58, no. 10, pp. 5151-5164, Oct. 2010.
[3] X.-L. Li and T. Adali, "Blind spatiotemporal separation of second and/or higher-order correlated sources by entropy rate minimization," in Proc. IEEE Int. Conf. Acoust., Speech, Signal Processing (ICASSP), Dallas, TX, Mar. 2010.
[4] G.-S. Fu, R. Phlypo, M. Anderson, X.-L. Li, and T. Adali, "Blind source separation by entropy rate minimization," IEEE Trans. Signal Processing, vol. 62, no. 16, pp. 4245-4255, Aug. 2014.