Kompella, Varun and Luciw, Matthew and Schmidhuber, Juergen (2012) Incremental Slow Feature Analysis: Adaptive Low-Complexity Slow Feature Updating from High-Dimensional Input Streams. Neural Computation, 24 (11). pp. 2994-3024. ISSN 0899-7667
Text: NECO-11-11-1588R2.pdf (2 MB)
Abstract
We introduce here an incremental version of Slow Feature Analysis (IncSFA), combining Candid Covariance-Free Incremental Principal Components Analysis (CCIPCA) and Covariance-Free Incremental Minor Components Analysis (CIMCA). IncSFA’s feature-updating complexity is linear in the input dimensionality, while batch SFA’s (BSFA) updating complexity is cubic. IncSFA does not need to store, or even compute, any covariance matrices. The drawback of IncSFA is data efficiency: it does not use each data point as effectively as BSFA. But IncSFA allows SFA to be tractably applied, with just a few parameters, directly to high-dimensional input streams (e.g., the visual input of an autonomous agent), while BSFA has to resort to hierarchical receptive-field-based architectures when the input dimension is too high. Further, IncSFA’s updates have simple Hebbian and anti-Hebbian forms, extending the biological plausibility of SFA. Experimental results show IncSFA learns the same set of features as BSFA and can handle a few cases where BSFA fails.
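The abstract outlines the IncSFA pipeline: CCIPCA incrementally estimates the principal components used to whiten the input, and a covariance-free minor-components rule then extracts slow features from the derivative of the whitened signal. The Python sketch below illustrates that flow under simplified assumptions (a fixed learning rate, no amnesic mean or mean subtraction, and a generic normalized anti-Hebbian minor-components update rather than the paper's exact CIMCA rule); the function names and constants are illustrative and are not the authors' implementation.

```python
# Minimal sketch of an IncSFA-style update loop (assumed simplifications noted above).
import numpy as np

def ccipca_step(V, x, eta):
    """One CCIPCA-style update: refine principal-component estimates V (rows) with sample x."""
    u = x.copy()
    for i in range(V.shape[0]):
        v = V[i]
        # Hebbian-style pull toward the current residual direction.
        V[i] = (1.0 - eta) * v + eta * u * (u @ v) / (np.linalg.norm(v) + 1e-12)
        # Deflate: remove the estimated component from the residual for the next row.
        vn = V[i] / (np.linalg.norm(V[i]) + 1e-12)
        u = u - (u @ vn) * vn
    return V

def mca_step(W, z_dot, eta, gamma=1.0):
    """Anti-Hebbian minor-components update on the derivative signal z_dot.
    Earlier slow-feature estimates laterally inhibit later ones (sequential deflation)."""
    for i in range(W.shape[0]):
        w = W[i]
        lateral = sum((W[j] @ w) * W[j] for j in range(i)) if i > 0 else 0.0
        w = w - eta * ((z_dot @ w) * z_dot + gamma * lateral)
        W[i] = w / (np.linalg.norm(w) + 1e-12)   # renormalize for stability
    return W

# Illustrative driver: random data stands in for one frame of the input stream.
rng = np.random.default_rng(0)
d, k, J, eta = 20, 10, 4, 0.005
V = rng.standard_normal((k, d)) * 0.1   # principal-component estimates
W = np.eye(J, k)                        # slow-feature estimates in whitened space
x_prev = None
for t in range(10000):
    x = rng.standard_normal(d)          # replace with a real high-dimensional sample
    V = ccipca_step(V, x, eta)
    # Whiten: project onto unit-normalized PCs and scale by 1/sqrt(eigenvalue estimate),
    # where the vector norms in V serve as the eigenvalue estimates.
    norms = np.linalg.norm(V, axis=1) + 1e-12
    z = (V / norms[:, None]) @ x / np.sqrt(norms)
    if x_prev is not None:
        W = mca_step(W, z - z_prev, eta)  # minor components of the derivative = slow features
    x_prev, z_prev = x, z
slow_features = W  # rows approximate the slowest directions of the whitened stream
```

Each step touches only the current sample and the stored component vectors, so the per-update cost stays linear in the input dimensionality, which is the complexity advantage over batch SFA described in the abstract.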
| Item Type: | Scientific journal article, Newspaper article or Magazine article |
|---|---|
| Uncontrolled Keywords: | SFA (Slow Feature Analysis), Incremental Learning, PCA (Principal Component Analysis), MCA (Minor Component Analysis) |
| Subjects: | Computer sciences > Artificial intelligence > Artificial intelligence not elsewhere classified |
| Depositing User: | Matthew Luciw |
| Date Deposited: | 11 Feb 2014 13:15 |
| Last Modified: | 23 May 2016 08:49 |
| URI: | http://repository.supsi.ch/id/eprint/3694 |