Learning in a time-varying environment is challenging because one must cope with an ever-changing data distribution. A common and effective solution is to learn from the data online and keep up with ongoing changes. The Quantized Kernel Least Mean Square (QKLMS) algorithm is an effective tool for online kernel learning in which the network size is capped by the size of the quantization dictionary. However, because it lacks a mechanism to eliminate outdated dictionary words, the learned model can become irrelevant over time. In this paper, a mechanism for removing irrelevant words from the QKLMS dictionary is proposed. Experimental results on chaotic time-series prediction validate the capability of the developed method to adapt to time-varying data.
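The abstract describes two mechanisms: QKLMS's quantization step, which merges a new sample into its nearest dictionary center when they are within a threshold (capping network growth), and the paper's proposed removal of irrelevant dictionary words. The sketch below, in Python, implements standard QKLMS with a simple staleness-based pruning rule as a stand-in for the paper's mechanism; the pruning criterion (`max_age`), parameter names, and all values are illustrative assumptions, since the abstract does not specify the actual removal rule.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # Gaussian kernel between two input vectors
    return np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))

class QKLMS:
    """Quantized Kernel Least Mean Square with a staleness-based pruning
    rule. The pruning criterion is a hypothetical stand-in; the paper's
    actual removal mechanism is not given in the abstract."""

    def __init__(self, eta=0.5, eps_u=0.3, sigma=1.0, max_age=1000):
        self.eta = eta          # learning rate
        self.eps_u = eps_u      # quantization threshold
        self.sigma = sigma      # kernel width
        self.max_age = max_age  # prune centers unused for this many steps
        self.centers, self.alphas, self.last_used = [], [], []
        self.t = 0

    def predict(self, x):
        # f(x) = sum_j alpha_j * k(c_j, x)
        return sum(a * gaussian_kernel(c, x, self.sigma)
                   for c, a in zip(self.centers, self.alphas))

    def update(self, x, y):
        self.t += 1
        e = y - self.predict(x)
        if self.centers:
            d = [np.linalg.norm(x - c) for c in self.centers]
            j = int(np.argmin(d))
            if d[j] <= self.eps_u:
                # quantization: merge the update into the nearest center
                # instead of growing the dictionary
                self.alphas[j] += self.eta * e
                self.last_used[j] = self.t
                self._prune()
                return e
        # otherwise add a new dictionary word
        self.centers.append(np.asarray(x, dtype=float))
        self.alphas.append(self.eta * e)
        self.last_used.append(self.t)
        self._prune()
        return e

    def _prune(self):
        # drop dictionary words that have not been touched recently
        keep = [i for i in range(len(self.centers))
                if self.t - self.last_used[i] <= self.max_age]
        self.centers = [self.centers[i] for i in keep]
        self.alphas = [self.alphas[i] for i in keep]
        self.last_used = [self.last_used[i] for i in keep]
```

With quantization alone the dictionary stops growing once the input region is covered; the added pruning step is what lets the model discard words from regions the time-varying process no longer visits.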
Title of host publication: 2015 IEEE International Conference on Digital Signal Processing, DSP 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 5
ISBN (Electronic): 9781479980581
Publication status: Published - 2015 Sep 9
Event: IEEE International Conference on Digital Signal Processing, DSP 2015 - Singapore, Singapore
Duration: 2015 Jul 21 → 2015 Jul 24
Name: International Conference on Digital Signal Processing, DSP
Bibliographical note: Publisher Copyright © 2015 IEEE.
All Science Journal Classification (ASJC) codes
- Signal Processing