Article

Transfer Learning in Adaptive Filters: The Nearest Instance Centroid-Estimation Kernel Least-Mean-Square Algorithm

Journal

IEEE Transactions on Signal Processing
Volume 65, Issue 24, Pages 6520-6535

Publisher

IEEE - Institute of Electrical and Electronics Engineers Inc.
DOI: 10.1109/TSP.2017.2752695

Keywords

Adaptive filter; kernel methods; kernel adaptive filtering (KAF); kernel least mean square (KLMS); least mean square; mean square convergence; spatial clustering; transfer learning; vector quantization

Funding

  1. DARPA [N66001-10-C-2008, N66001-15-1-4054]

Abstract

We propose a novel nearest-neighbors approach to organize and curb the growth of the radial basis function network in kernel adaptive filtering (KAF). The nearest-instance-centroid-estimation (NICE) kernel least-mean-square (KLMS) algorithm provides a favorable time-space tradeoff with good performance. Its centers in the input/feature space are organized into quasi-orthogonal regions, which greatly simplifies filter evaluation. Instead of using all centers to evaluate/update the function approximation at every new point, a linear search among the iteratively updated centroids determines the partial function to be used, naturally forming locally supported partial functionals. Under this framework, the partial functionals that compose the adaptive filter are quickly stored/retrieved based on the input, each corresponding to a specialized spatial-band subfilter. Filter evaluation becomes the update of one of these subfilters, creating a content-addressable filter bank (CAFB). The CAFB is incrementally updated for new signal applications under mild constraints, always reusing the previously learned partial filter sums; this opens the door to transfer learning and significant efficiency gains in new data scenarios, avoiding the training from scratch that has been standard practice since the invention of adaptive filtering. Using the energy-conservation relation, we derive a sufficient condition for the mean-square convergence of NICE-KLMS and establish upper and lower bounds on the steady-state excess mean-square error (EMSE). Simulations on chaotic time-series prediction demonstrate accuracy comparable to existing methods, but with much faster computation involving fewer input samples. Simulations on transfer learning, using both synthetic and real-world data, demonstrate that the NICE CAFB can leverage previously learned knowledge in a related task or domain.
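To make the mechanism described above concrete, here is a minimal Python/NumPy sketch of the NICE-KLMS update as we read it: a linear search over cluster centroids selects (or spawns) a subfilter, and only that subfilter's centers are evaluated and grown. This is an illustrative reconstruction, not the authors' reference code; the step size eta, kernel bandwidth sigma, distance threshold dc, and the simple running-mean centroid update are our assumptions, and the CAFB transfer/merging logic from the paper is omitted.

import numpy as np

class NICEKLMS:
    """Illustrative sketch of NICE-KLMS: KLMS centers grouped into
    quasi-orthogonal clusters; only the nearest cluster's centers
    (one subfilter) are used to evaluate and update the filter."""

    def __init__(self, eta=0.1, sigma=1.0, dc=2.0):
        self.eta = eta        # KLMS step size (assumed name)
        self.sigma = sigma    # Gaussian kernel bandwidth (assumed name)
        self.dc = dc          # centroid distance threshold (assumed name)
        self.clusters = []    # each: {'centroid', 'centers', 'alphas'}

    def _kernel(self, X, u):
        # Gaussian kernel between each stored center in X and input u
        d = np.sum((X - u) ** 2, axis=1)
        return np.exp(-d / (2.0 * self.sigma ** 2))

    def _nearest(self, u):
        # Linear search among centroids, as in the abstract
        dists = [np.linalg.norm(c['centroid'] - u) for c in self.clusters]
        i = int(np.argmin(dists))
        return i, dists[i]

    def predict(self, u):
        if not self.clusters:
            return 0.0
        i, _ = self._nearest(u)
        c = self.clusters[i]
        return float(self._kernel(np.array(c['centers']), u) @ np.array(c['alphas']))

    def update(self, u, d):
        u = np.asarray(u, dtype=float)
        if self.clusters:
            i, dist = self._nearest(u)
        else:
            i, dist = -1, np.inf
        if dist > self.dc:
            # Spawn a new cluster/subfilter; the paper's CAFB can instead
            # initialize it from previously learned partial filter sums.
            self.clusters.append({'centroid': u.copy(), 'centers': [], 'alphas': []})
            i = len(self.clusters) - 1
        c = self.clusters[i]
        y = 0.0
        if c['centers']:
            y = float(self._kernel(np.array(c['centers']), u) @ np.array(c['alphas']))
        e = d - y                       # prediction error on the active subfilter
        c['centers'].append(u.copy())   # allocate a new RBF center
        c['alphas'].append(self.eta * e)
        n = len(c['centers'])           # running mean keeps the centroid current
        c['centroid'] += (u - c['centroid']) / n
        return e

For example, one-step prediction of a scalar series from its previous sample:

f = NICEKLMS(eta=0.2, sigma=0.5, dc=1.5)
x = np.sin(0.3 * np.arange(200))
for t in range(1, 200):
    e = f.update(x[t-1:t], x[t])   # error shrinks as the filter bank adapts

The convergence claims rest on the energy-conservation method. For reference, a sketch of the generic kernel-space relation used in standard KLMS mean-square analyses, in the usual notation (RKHS weight-error $\tilde{\Omega}(i)$, a priori error $e_a(i)$, a posteriori error $e_p(i)$):

\[
\|\tilde{\Omega}(i)\|_{\mathbb{F}}^{2} + \frac{e_a^{2}(i)}{\kappa(\mathbf{u}(i),\mathbf{u}(i))}
= \|\tilde{\Omega}(i-1)\|_{\mathbb{F}}^{2} + \frac{e_p^{2}(i)}{\kappa(\mathbf{u}(i),\mathbf{u}(i))}
\]

The paper's sufficient step-size condition and EMSE bounds are derived from this type of relation, with the NICE-specific forms restricting the analysis to the active subfilter; consult the paper for the exact statements.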

Authors

Kan Li; Jose C. Principe