Pruning reservoirs with random static projections

J.B. Butcher, C.R. Day, P.W. Haycock, D. Verstraeten, B. Schrauwen

Research output: Conference publication (chapter in book / published conference output)


Reservoir Computing is a relatively new field of Recurrent Neural Networks in which only the output weights are recalculated by the training process, removing the problems associated with traditional gradient descent algorithms. Because the reservoir is recurrent, it can possess short-term memory, but there is a trade-off between the amount of memory a reservoir can have and its nonlinear mapping capabilities. A new, custom architecture was recently proposed to overcome this by combining a reservoir with an extreme learning machine to deliver improved results. This paper extends this architecture further by introducing a ranking and pruning algorithm which removes neurons according to their significance. This provides further insight into the type of reservoir characteristics needed for a given task, and is supported by additional reservoir measures of nonlinearity and memory. These techniques are demonstrated on artificial and real-world data.
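The ranking-and-pruning idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' actual algorithm: the significance measure (output-weight magnitude scaled by state variance), the ridge readout, the toy signal, and all parameters are assumptions chosen only to show the general mechanism of training a readout, ranking reservoir neurons, and pruning the least significant ones.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: one-step-ahead prediction of a sine wave (illustrative only).
T, n_res = 500, 100
u = np.sin(np.arange(T) * 0.2).reshape(-1, 1)
y_target = np.roll(u, -1, axis=0)

# Random reservoir (echo state network style): input and recurrent
# weights are fixed at random; only the readout is ever trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def run_reservoir(u, keep=None):
    """Collect reservoir states; `keep` is a 0/1 mask for pruned neurons."""
    x = np.zeros(n_res)
    states = np.zeros((len(u), n_res))
    for t in range(len(u)):
        x = np.tanh(W_in @ u[t] + W @ x)
        if keep is not None:
            x = x * keep  # pruned neurons contribute nothing
        states[t] = x
    return states

def train_readout(X, y, ridge=1e-6):
    # Ridge regression: only the output weights are recalculated.
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)

X = run_reservoir(u)
W_out = train_readout(X, y_target)

# Rank neurons by an assumed significance proxy:
# |output weight| times the neuron's state standard deviation.
significance = np.abs(W_out[:, 0]) * X.std(axis=0)
order = np.argsort(significance)       # least significant first
keep = np.ones(n_res)
keep[order[: n_res // 2]] = 0.0        # prune the bottom half

# Retrain the readout on the pruned reservoir and check the error.
X_pruned = run_reservoir(u, keep=keep)
W_out_pruned = train_readout(X_pruned, y_target)
err = np.mean((X_pruned @ W_out_pruned - y_target) ** 2)
```

In this sketch the reservoir is re-run with a mask and the readout retrained after pruning; on the toy sine task half the neurons can typically be removed with little loss, which mirrors the paper's motivation for using pruning to reveal which reservoir characteristics a task actually needs.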
Original language: English
Title of host publication: Proceedings of the 2010 IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2010
ISBN (Electronic): 978-1-4244-7877-4
ISBN (Print): 978-1-4244-7875-0
Publication status: Published - 7 Oct 2010
Event: 2010 IEEE International Workshop on Machine Learning for Signal Processing - Kittila, Finland
Duration: 29 Aug 2010 - 1 Sept 2010



