Linear Adaptive Filtering Using Entropy Bound Minimization (EBM)

Adaptive filtering has been extensively studied under the assumption of Gaussian noise, for which the widely used least-mean-square-error (LMSE) filter is optimal. In many practical applications, however, the noise is modeled more accurately by a non-Gaussian distribution.

We consider non-Gaussian distributions for the noise model and show that the adaptive filter based on entropy bound minimization (EBM) leads to significant performance gain compared to the LMSE filter [1]. The least mean p-norm (LMP) filter, which uses the α-stable distribution to model the noise, is shown to be the maximum likelihood solution when the noise follows a generalized Gaussian distribution (GGD). The GGD noise model allows us to compute the Cramér-Rao lower bound (CRLB) for the error in estimating the weights. Simulations show that both the EBM and LMP filters achieve the CRLB as the sample size increases. The EBM filter is shown to be less committed with respect to unseen data, and hence yields generally superior online-learning performance compared to LMP. We also show that, when the noise comes from impulsive α-stable distributions, both the EBM and LMP filters outperform the LMSE filter. In addition, the EBM filter offers the advantage that it does not assume a particular parametric model for the noise; by proper selection of the measuring functions, it can be adapted to a wide range of noise distributions.
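
To make the LMSE/LMP contrast concrete, the following minimal Python sketch compares the two weight updates on a toy linear model with GGD noise. It is an illustration under stated assumptions, not the authors' implementation: the setup, step size mu, and exponent values are arbitrary choices, and the EBM filter itself (which requires a nonparametric entropy bound estimator) is not shown. The stochastic-gradient update w <- w + mu * p * |e|^(p-1) * sign(e) * x reduces to the LMSE filter for p = 2 and gives the LMP filter when p is matched to the GGD shape parameter.

import numpy as np

rng = np.random.default_rng(0)

def ggd_noise(beta, scale, size, rng):
    # Generalized Gaussian samples: if G ~ Gamma(1/beta, 1), then
    # sign * scale * G**(1/beta) has density proportional to exp(-|x/scale|**beta).
    g = rng.gamma(1.0 / beta, 1.0, size)
    return rng.choice(np.array([-1.0, 1.0]), size) * scale * g ** (1.0 / beta)

def lmp_filter(X, y, mu, p):
    # Least mean p-norm stochastic-gradient update; p = 2.0 recovers LMSE (LMS).
    w = np.zeros(X.shape[1])
    for x, t in zip(X, y):
        e = t - x @ w
        w += mu * p * np.abs(e) ** (p - 1.0) * np.sign(e) * x
    return w

w_true = np.array([0.5, -1.0, 2.0, 0.25])
X = rng.standard_normal((5000, w_true.size))
y = X @ w_true + ggd_noise(beta=1.2, scale=0.5, size=5000, rng=rng)

w_lmse = lmp_filter(X, y, mu=0.01, p=2.0)  # Gaussian-optimal baseline
w_lmp = lmp_filter(X, y, mu=0.01, p=1.2)   # exponent matched to the GGD
print("LMSE weight error:", np.linalg.norm(w_lmse - w_true))
print("LMP weight error: ", np.linalg.norm(w_lmp - w_true))

With the exponent matched to the noise (here p = beta = 1.2), the LMP update is the maximum likelihood gradient step for GGD noise, which is why it can approach the CRLB while the LMSE filter cannot.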


References:

[1] H. Li and T. Adali, "A Class of Adaptive Algorithms Based on Entropy Estimation Achieving CRLB for Linear Non-Gaussian Filtering," submitted to IEEE Trans. Signal Processing, 2011.