Attractor dynamics in networks with learning rules inferred from data

Summary

Date: May 10, 2017, 1:00pm - 2:00pm
Location: Northwest 243

About the Speaker
Name: Nicolas Brunel
Speaker Affiliation: University of Chicago

The attractor neural network (ANN) scenario is a popular framework for
memory storage in cortex, but a large gap remains between these models
and experimental data. In particular, the distributions of the learned patterns
and the learning rules are typically not constrained by data. In primate inferotemporal (IT) cortex, the distribution of
neuronal responses is close to lognormal, at odds with bimodal distributions of
firing rates used in the vast majority of theoretical studies. Furthermore, we
recently showed that differences between the statistics of responses to novel
and familiar stimuli are consistent with a Hebbian learning rule whose
dependence on post-synaptic firing rate is non-linear and dominated by depression.
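A depression-dominated, nonlinear post-synaptic dependence of this kind can be sketched as follows; the specific functional form and parameters are illustrative placeholders, not the rule inferred from the data:

```python
import numpy as np

def hebbian_update(W, r_pre, r_post, eta=0.01, theta=5.0):
    """One step of an illustrative Hebbian rule whose post-synaptic factor
    f(r_post) is nonlinear: negative (depression) below the threshold theta,
    weakly positive (potentiation) above it, so depression dominates.
    The form f(r) = tanh(r - theta) - 0.5 is a hypothetical placeholder."""
    f_post = np.tanh(r_post - theta) - 0.5   # nonlinear, depression-dominated
    g_pre = r_pre                            # linear pre-synaptic dependence
    return W + eta * np.outer(f_post, g_pre)

# Apply one update with lognormally distributed rates, as observed in IT cortex
rng = np.random.default_rng(0)
rates = rng.lognormal(mean=1.0, sigma=1.0, size=100)
W = hebbian_update(np.zeros((100, 100)), rates, rates)
```

Because most rates fall below the threshold, the average weight change is negative, capturing the dominance of depression.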

We investigated the dynamics of a network model in which both
distributions of the learned patterns and the learning rules are inferred from
data. Using both mean field theory and simulations, we show that this network
exhibits attractor dynamics. Furthermore, we show that the storage capacity of
networks with learning rules inferred from data is close to the optimal
capacity, in the space of unsupervised Hebbian rules.
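For intuition, attractor retrieval can be simulated in a few lines in the classic binary (bimodal) setting that the abstract contrasts with; the talk's model replaces the binary patterns and covariance rule used below with lognormal patterns and a data-inferred rule:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 500, 10   # network size and number of stored patterns (illustrative)

# Binary patterns stored with the standard Hopfield covariance rule
patterns = rng.choice([-1.0, 1.0], size=(P, N))
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# Cue the network with a corrupted copy of pattern 0 (10% of units flipped)
state = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
state[flip] *= -1.0

# Synchronous dynamics: iterate until a fixed point (an attractor) is reached
for _ in range(50):
    new_state = np.sign(W @ state)
    new_state[new_state == 0] = 1.0
    if np.array_equal(new_state, state):
        break
    state = new_state

overlap = state @ patterns[0] / N   # close to 1 means the memory was retrieved
```

With only 10 patterns in 500 units the network is far below capacity, so the corrupted cue relaxes onto the stored pattern.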

These networks lead to unimodal distributions of firing rates during the
delay period, consistent with data from delayed match-to-sample experiments.
Finally, we show that there is a transition to a chaotic phase at strong coupling,
with an extensive number of chaotic attractor states correlated with
the stored patterns.
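The transition to chaos at strong coupling can be illustrated with a generic random rate network by tracking the divergence of two nearby trajectories; the gain, network size, and integration scheme below are arbitrary choices, not the talk's model:

```python
import numpy as np

rng = np.random.default_rng(2)
N, g, dt = 300, 2.0, 0.1   # network size, coupling gain, Euler step (illustrative)

# Random coupling matrix; gain g > 1 puts this generic rate network
# in its chaotic regime
J = g / np.sqrt(N) * rng.standard_normal((N, N))

def step(x):
    """One Euler step of dx/dt = -x + J tanh(x)."""
    return x + dt * (-x + J @ np.tanh(x))

# Two trajectories starting a tiny distance apart
x1 = rng.standard_normal(N)
x2 = x1 + 1e-6 * rng.standard_normal(N)
d0 = np.linalg.norm(x1 - x2)

for _ in range(500):            # integrate for 50 time units
    x1, x2 = step(x1), step(x2)

d = np.linalg.norm(x1 - x2)     # grows by orders of magnitude when chaotic
```

In the chaotic phase the tiny initial perturbation is amplified exponentially until it saturates at the scale of the network activity; for a weak gain (g < 1) the two trajectories would instead converge.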