Pruning reservoirs with random static projections

Abstract
Reservoir Computing is a relatively new approach to Recurrent Neural Networks in which only the output weights are adjusted during training, avoiding the problems associated with traditional gradient-descent algorithms. Because the reservoir is recurrent, it can possess short-term memory, but there is a trade-off between the amount of memory a reservoir can retain and its nonlinear mapping capability. A new, custom architecture was recently proposed to overcome this trade-off by combining a reservoir with an extreme learning machine, delivering improved results. This paper extends that architecture by introducing a ranking and pruning algorithm that removes neurons according to their significance. This provides further insight into the kind of reservoir characteristics a given task requires, and is supported by additional reservoir measures of nonlinearity and memory. The techniques are demonstrated on artificial and real-world data.
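To illustrate the general idea described above, the following is a minimal sketch of an echo state network on a toy delayed-recall task: the recurrent reservoir weights stay fixed, only the linear readout is trained, and neurons are then pruned. Note this is a generic reservoir-computing example, not the paper's specific architecture (which adds an extreme learning machine), and ranking neurons by readout-weight magnitude is an assumed stand-in for the paper's significance ranking.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: reproduce the input signal delayed by a few steps,
# which requires the reservoir to hold short-term memory.
T, washout, delay = 1000, 100, 3
u = rng.uniform(-1, 1, T)
y_target = np.roll(u, delay)

# Random recurrent reservoir; these weights are never trained.
N = 50
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius below 1

x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Only the output weights are computed, here by ridge regression.
X, Y = states[washout:], y_target[washout:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ Y)
err_full = np.mean((X @ W_out - Y) ** 2)

# Illustrative pruning: keep the 30 neurons with the largest readout
# weights and retrain the readout on the reduced reservoir (assumed
# significance measure; the paper's ranking criterion may differ).
keep = np.argsort(np.abs(W_out))[-30:]
W_out_p = np.linalg.solve(
    X[:, keep].T @ X[:, keep] + ridge * np.eye(len(keep)), X[:, keep].T @ Y
)
err_pruned = np.mean((X[:, keep] @ W_out_p - Y) ** 2)
```

Because training reduces to one linear solve, comparing `err_full` and `err_pruned` for different pruning levels is cheap, which is what makes significance-based pruning practical to explore.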
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 2010 IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2010 |
| Publisher | IEEE |
| ISBN (Electronic) | 978-1-4244-7877-4 |
| ISBN (Print) | 978-1-4244-7875-0 |
| DOIs | |
| Publication status | Published - 7 Oct 2010 |
| Event | 2010 IEEE International Workshop on Machine Learning for Signal Processing, Kittila, Finland |
| Duration | 29 Aug 2010 → 1 Sept 2010 |
Conference
| Conference | 2010 IEEE International Workshop on Machine Learning for Signal Processing |
|---|---|
| Country/Territory | Finland |
| City | Kittila |
| Period | 29/08/10 → 1/09/10 |