Advances in Biologically Inspired Reservoir Computing

S Scardapane, John Butcher, SM Bianchi, ZK Malik

Research output: Contribution to journal › Article › peer-review


The interplay between randomness and optimization has always been a major theme in the design of neural networks [3]. Over the last 15 years, the success of reservoir computing (RC) has shown that, in many scenarios, the algebraic structure of the recurrent component matters far more than the precise fine-tuning of its weights. As long as the recurrent part of the network possesses a form of fading memory of the input, and the activations of its neurons are sufficiently heterogeneous, its dynamics suffice to process many spatio-temporal signals efficiently. Even though fully optimizing deep recurrent networks is feasible today, doing so still demands considerable experience and practice, not to mention vast computational resources, which limits their applicability on simpler platforms (e.g., embedded systems) or in areas where time is of key importance (e.g., online systems). Not surprisingly, then, RC remains a powerful tool for quickly solving dynamical problems, and it has become invaluable for modeling and analysis in neuroscience.
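The idea sketched above (a fixed random recurrent reservoir whose fading memory does the temporal processing, with only a linear readout trained) can be illustrated with a minimal echo state network in NumPy. This is a toy sketch under assumed choices, not the method of any specific paper: the reservoir size, spectral radius of 0.9, sine-wave next-step task, washout length, and ridge regularization are all illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task (assumption for illustration): predict x(t+1) from x(t) on a sine wave.
T = 500
signal = np.sin(0.1 * np.arange(T + 1))
u, y = signal[:-1], signal[1:]

# Fixed random reservoir: the recurrent weights are never trained.
n_res = 100
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
# Rescale so the spectral radius is below 1, giving the state
# a fading memory of past inputs (the echo state property).
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

# Run the reservoir and collect its states.
states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W_in * u[t] + W @ x)
    states[t] = x

# Train only the linear readout, via ridge regression on the states.
washout = 50  # discard the initial transient
S, Y = states[washout:], y[washout:]
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ Y)

pred = S @ W_out
mse = np.mean((pred - Y) ** 2)
```

Note that the only optimization step is a single linear solve for `W_out`; the random recurrent part is left untouched, which is what makes RC so cheap compared with fully training a deep recurrent network.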
Original language: English
Pages (from-to): 295–296
Journal: Cognitive Computation
Issue number: 3
Early online date: 28 Apr 2017
Publication status: Published - 1 Jun 2017

Bibliographical note

© Springer Nature B.V. 2017. The final publication is available at Springer.

