Neurobiology

Re-encoding of Associations by Recurrent Plasticity Increases Memory Capacity

Front. Synaptic Neurosci. 6:13. doi: 10.3389/fnsyn.2014.00013. eCollection 2014.

Authors/Editors: Medina C, Leibold C
Publication Date: 2014
Type of Publication: Journal Articles 2001 - 2017

Abstract

Recurrent networks have been proposed as a model of associative memory. In such models, memory items are stored in the strength of connections between neurons. These modifiable connections, or synapses, constitute a shared resource among all stored memories, limiting the capacity of the network. Synaptic plasticity at different time scales can play an important role in optimizing the representation of associative memories by keeping them sparse, uncorrelated, and non-redundant. Here, we use a model of sequence memory to illustrate how plasticity allows a recurrent network to self-optimize by gradually re-encoding the representation of its memory items. A learning rule is used to sparsify large patterns, i.e., patterns with many active units. As a result, pattern sizes become more homogeneous, which increases the network's dynamical stability during sequence recall and allows more patterns to be stored. Finally, we show that the learning rule allows for online learning in that it keeps the network in a robust dynamical steady state while storing new memories and overwriting old ones.
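The core mechanism the abstract describes can be illustrated with a toy example: store a sequence of binary patterns in a recurrent weight matrix via Hebbian associations, sparsify oversized patterns to a common target size, and then recall the sequence step by step. The sketch below is a minimal illustration under simplifying assumptions (random pruning as the sparsification step, a fixed firing threshold); it is not the actual model or learning rule of Medina and Leibold (2014).

```python
import numpy as np

# Illustrative sketch only. The sparsification rule (random pruning) and
# the threshold dynamics here are assumptions, not the paper's model.

rng = np.random.default_rng(0)
N = 200   # number of units
K = 10    # target pattern size (active units per pattern)
P = 20    # number of patterns in the stored sequence

# Heterogeneous pattern sizes: some patterns have many more active
# units than the target K, which destabilizes recall.
sizes = rng.integers(K, 4 * K, size=P)
patterns = [rng.choice(N, size=s, replace=False) for s in sizes]

def sparsify(active, k):
    """Re-encode an oversized pattern down to k active units.
    (Here units are dropped at random; a learning rule would instead
    select them gradually, e.g., by synaptic weight.)"""
    if len(active) <= k:
        return active
    return rng.choice(active, size=k, replace=False)

# After re-encoding, all pattern sizes are homogeneous (== K).
patterns = [sparsify(p, K) for p in patterns]

# Store the sequence: Hebbian association from pattern t to pattern t+1.
W = np.zeros((N, N))
for pre, post in zip(patterns[:-1], patterns[1:]):
    W[np.ix_(post, pre)] += 1.0

def recall_step(x, theta=K):
    """One synchronous update; every unit fires iff its summed
    recurrent input reaches the threshold theta (assumed fixed)."""
    return (W @ x >= theta).astype(float)

# Recall the whole sequence starting from its first pattern.
x = np.zeros(N)
x[patterns[0]] = 1.0
recalled = []
for _ in range(P - 1):
    x = recall_step(x)
    recalled.append(set(np.flatnonzero(x)))

# With homogeneous pattern sizes, each step reactivates exactly the
# stored successor pattern.
print(all(recalled[t] == set(patterns[t + 1]) for t in range(P - 1)))
```

Because every stored pattern has exactly K active units after sparsification, each unit of the correct successor pattern receives recurrent input of at least K, while spurious units would need an implausibly large overlap between unrelated patterns to cross the threshold; this is the sense in which homogeneous pattern sizes stabilize sequence recall.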
