From synapse to network: models of information storage and retrieval in brain networks

by Prof. Nicolas Brunel, Duke University

When: June 16th, 2022 – 3:00 pm

Where: VIMM Meeting room – Recording available on Mediaspace

Abstract: Brains have a remarkable ability to store information about the external world, on time scales that range from seconds to the lifetime of an animal. What are the mechanisms by which information is stored in the brain, and how is stored information retrieved from memory? One of the central hypotheses of neuroscience is that information is stored through synaptic plasticity – modifications of the synaptic connectivity between neurons. Theoretical models have explored the impact of such synaptic plasticity mechanisms on network dynamics. In one scenario, in which synaptic changes are predominantly temporally symmetric, learning creates fixed-point attractor states of the network dynamics, one for each item stored in memory. In another scenario, in which changes have a strong temporally asymmetric component, learning creates sequences of network activity. In this talk, I will present recent instantiations of these models that are simple enough to allow mean-field calculations, yet detailed enough to permit quantitative comparisons with experimental data. I will also show how heterogeneities in synaptic plasticity can allow networks to switch flexibly between the fixed-point attractor regime and the sequence regime, and to vary the speed at which sequences are retrieved.
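
As a concrete illustration of the two scenarios in the abstract, here is a minimal toy simulation. It is not Prof. Brunel's model: it assumes a classic binary Hopfield-style network of ±1 neurons, in which a temporally symmetric Hebbian rule creates fixed-point attractors, while a temporally asymmetric rule (each stored pattern strengthening synapses toward its successor) creates a retrievable sequence. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 500, 5                                 # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))   # random binary memories

# Temporally symmetric Hebbian rule -> fixed-point attractors (Hopfield-like)
W_sym = patterns.T @ patterns / N
np.fill_diagonal(W_sym, 0.0)

# Temporally asymmetric rule: pattern mu strengthens synapses toward mu+1
W_asym = patterns[1:].T @ patterns[:-1] / N

def overlap(state, p):
    """Correlation between the network state and a stored pattern."""
    return state @ p / N

# Fixed-point retrieval from a corrupted cue: flip 10% of pattern 0's bits
cue = patterns[0].copy()
flip = rng.choice(N, N // 10, replace=False)
cue[flip] *= -1

state = cue
for _ in range(5):                            # synchronous sign dynamics
    state = np.sign(W_sym @ state)
print("fixed-point overlap with pattern 0:", overlap(state, patterns[0]))

# Sequence retrieval: each update advances the state by one pattern
state = patterns[0]
for mu in range(1, P):
    state = np.sign(W_asym @ state)
    print(f"step {mu}: overlap with pattern {mu} =", overlap(state, patterns[mu]))
```

At low memory load, the symmetric network cleans up the corrupted cue into the stored pattern (overlap near 1, a fixed point of the dynamics), while each update of the asymmetric network moves the state one pattern forward along the stored sequence.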

Short bio: Nicolas Brunel is Professor of Neurobiology and Physics at Duke University. He is also a member of the Center for Cognitive Neuroscience and a Faculty Network Member of the Duke Institute for Brain Sciences. He uses theoretical models of brain systems to investigate how they process and learn information from their inputs. His current work, in collaboration with various experimental groups, focuses on the mechanisms of learning and memory, from the synapse to the network level. Using methods from statistical physics, he has recently shown that the synaptic connectivity of a network that maximizes storage capacity reproduces two key experimentally observed features: low connection probability and strong overrepresentation of bidirectionally connected pairs of neurons. He has also inferred ‘synaptic plasticity rules’ (mathematical descriptions of how synaptic strength depends on the activity of pre- and postsynaptic neurons) from data, and shown that networks endowed with a plasticity rule inferred from data have a storage capacity close to the optimal bound.