
Create Efficient and Complex Reservoir Computing Architectures with ReservoirPy

  • Conference paper

Part of the book series: Lecture Notes in Computer Science ((LNAI,volume 13499))

Abstract

Reservoir Computing (RC) is a type of recurrent neural network (RNN) in which learning is restricted to the output weights. RC models are often viewed as temporal Support Vector Machines (SVMs) for the way they project inputs onto dynamic, non-linear, high-dimensional representations. This paradigm, mainly represented by Echo State Networks (ESNs), has been successfully applied to a wide variety of tasks, from time series forecasting to sequence generation. It offers a de facto fast, simple, yet efficient way to train RNNs.
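The core idea above, a fixed random recurrent network whose readout alone is trained by ridge regression, can be sketched in plain NumPy. This is a minimal illustration of the RC paradigm, not code from the library; all sizes and hyperparameter values (100 units, leak rate 0.3, spectral radius 0.9) are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

# Fixed random input and recurrent weights: these are never trained.
n_inputs, n_reservoir = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # rescale spectral radius to 0.9

def run_reservoir(inputs, leak=0.3):
    """Collect the leaky-integrator reservoir states for an input sequence."""
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
series = np.sin(np.linspace(0, 8 * np.pi, 500))[:, None]
X, Y = series[:-1], series[1:]
S = run_reservoir(X)

# Only the output weights are learned, here with ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_reservoir), S.T @ Y)
pred = S @ W_out
```

The high-dimensional states `S` play the role of the non-linear feature map; training reduces to a single linear solve, which is why ESN training is fast compared to backpropagation through time.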

We present in this paper a library that facilitates the creation of RC architectures, from the simplest to the most complex, based on the Python scientific stack (NumPy, SciPy). This library offers memory- and time-efficient implementations of both online and offline training paradigms, such as FORCE learning or parallel ridge regression. The flexibility of the API allows ESNs to be designed quickly from reusable and customizable components. It enables building models such as DeepESNs, as well as other advanced architectures with complex connectivity between multiple reservoirs and feedback loops. Extensive documentation and tutorials, for newcomers and experts alike, are provided on the GitHub and ReadTheDocs websites.

The paper introduces the main concepts supporting the library, illustrated with code examples covering popular RC techniques from the literature. We argue that such a flexible, dedicated library will ease the creation of more advanced architectures while guaranteeing their correct implementation and reproducibility across the RC community.

Supported by Inria.


Notes

  1. Appendices are available at https://github.com/reservoirpy/publications/tree/main/2022-SAB.


Author information

Correspondence to Xavier Hinaut.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Trouvain, N., Rougier, N., Hinaut, X. (2022). Create Efficient and Complex Reservoir Computing Architectures with ReservoirPy. In: Cañamero, L., Gaussier, P., Wilson, M., Boucenna, S., Cuperlier, N. (eds) From Animals to Animats 16. SAB 2022. Lecture Notes in Computer Science, vol 13499. Springer, Cham. https://doi.org/10.1007/978-3-031-16770-6_8


  • DOI: https://doi.org/10.1007/978-3-031-16770-6_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-16769-0

  • Online ISBN: 978-3-031-16770-6

  • eBook Packages: Computer Science (R0)
