Abstract
Working memory is the ability to maintain and manipulate information. We introduce a method based on conceptors that allows us to manipulate information stored in the dynamics (latent space) of a gated working memory model. The latter model is based on a reservoir: a random recurrent network with trainable readouts. It is trained to hold a value in memory given an input stream when a gate signal is on, and to maintain this information when the gate is off. The memorized information gives rise to complex dynamics inside the reservoir that can be faithfully captured by a conceptor. Such conceptors allow us to explicitly manipulate this information in order to perform various, but not arbitrary, operations. In this work, we show (1) how working memory can be stabilized or discretized using such conceptors, (2) how such conceptors can be linearly combined to form new memories, and (3) how these conceptors can be extended to a functional role. These preliminary results suggest that conceptors can be used to manipulate the latent space of the working memory, even though several of the results we report are less intuitive than one might expect.
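To make the central object concrete: a conceptor (in the sense of Jaeger, 2014) is a soft projection matrix computed from the correlation matrix of reservoir states collected while the network holds a given memory. The sketch below, a minimal illustration rather than the authors' actual model, builds a small hypothetical leaky-tanh reservoir (sizes, weights, leak rate, and aperture are all assumed for illustration), drives it with a constant input standing in for a memorized value, and computes the corresponding conceptor C = R(R + α⁻²I)⁻¹.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical miniature reservoir: 50 leaky-tanh units driven by a scalar input.
N = 50
W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))   # random recurrent weights
W_in = rng.normal(0, 1.0, (N, 1))             # input weights

def run_reservoir(u, leak=0.5):
    """Collect reservoir states for an input sequence u of shape (T, 1)."""
    x = np.zeros(N)
    states = []
    for t in range(len(u)):
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in @ u[t])
        states.append(x.copy())
    return np.array(states)

# States induced while "holding" a value (here: a constant input stream).
X = run_reservoir(np.full((500, 1), 0.3))

# Conceptor: C = R (R + aperture^-2 I)^-1, with R the state correlation matrix.
R = X.T @ X / len(X)
aperture = 10.0
C = R @ np.linalg.inv(R + aperture ** -2 * np.eye(N))

# C acts as a soft projection onto the subspace occupied by the memory's
# dynamics: its singular values all lie in [0, 1).
s = np.linalg.svd(C, compute_uv=False)
print(round(s.max(), 3), round(s.min(), 6))
```

Because each eigenvalue λ of R maps to λ/(λ + α⁻²), conceptors for different memories live on a common scale and can be compared or linearly combined, which is what makes operations like the interpolation of memories in point (2) of the abstract possible.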
Ethics declarations
Conflict of Interest
The authors declare they have no conflict of interest.
Additional information
Ethical Approval
This article does not contain any studies with human participants or animals performed by any of the authors.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Nicolas P. Rougier and Xavier Hinaut contributed equally to this work.
This article is part of the Topical Collection on Trends in Reservoir Computing
Guest Editors: Claudio Gallicchio, Alessio Micheli, Simone Scardapane, Miguel C. Soriano
About this article
Cite this article
Strock, A., Rougier, N.P. & Hinaut, X. Latent Space Exploration and Functionalization of a Gated Working Memory Model Using Conceptors. Cogn Comput 15, 1485–1496 (2023). https://doi.org/10.1007/s12559-020-09797-3