CUED Publications database

Fundamental bounds on learning performance in neural circuits.

Raman, DV and Rotondo, AP and O'Leary, T (2019) Fundamental bounds on learning performance in neural circuits. Proceedings of the National Academy of Sciences of the United States of America, 116. pp. 10537-10546. ISSN 1091-6490

Full text not available from this repository.


How does the size of a neural circuit influence its learning performance? Larger brains tend to be found in species with higher cognitive function and learning ability. Intuitively, we expect the learning capacity of a neural circuit to grow with the number of neurons and synapses. We show how adding apparently redundant neurons and connections to a network can make a task more learnable. Consequently, large neural circuits can either devote connectivity to generating complex behaviors or exploit this connectivity to achieve faster and more precise learning of simpler behaviors. However, we show that in a biologically relevant setting where synapses introduce an unavoidable amount of noise, there is an optimal size of network for a given task. Above the optimal network size, the addition of neurons and synaptic connections starts to impede learning performance. This suggests that the size of brain circuits may be constrained by the need to learn efficiently with unreliable synapses and provides a hypothesis for why some neurological learning deficits are associated with hyperconnectivity. Our analysis is independent of specific learning rules and uncovers fundamental relationships between learning rate, task performance, network size, and intrinsic noise in neural circuits.
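The tradeoff described in the abstract can be illustrated with a toy sketch (this is an illustrative model, not the paper's analysis): many redundant synapses all receive the same gradient signal, so a larger network corrects the readout error faster per step, but each synapse also injects independent noise, so beyond some size the steady-state error grows. All parameter values (`eta`, `sigma`, the scalar target) are arbitrary choices for the demonstration.

```python
import numpy as np

def simulate(n, eta=0.01, sigma=0.01, steps=5000, seed=1):
    """Learn a scalar target with n redundant synapses under per-synapse noise.

    The readout is the plain sum of the weights, so every synapse sees the
    same error signal; the effective learning rate scales with n, but so
    does the total injected noise.
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(n)
    sq_err = np.empty(steps)
    for t in range(steps):
        err = w.sum() - 1.0                      # readout error vs. target
        w -= eta * err                           # same gradient step at every synapse
        w += sigma * rng.standard_normal(n)      # independent synaptic noise
        sq_err[t] = err ** 2
    return sq_err

# Early error falls faster for larger n, but the steady-state
# error floor eventually rises with n.
for n in (1, 10, 190):
    e = simulate(n)
    print(f"n={n:4d}  early={e[:200].mean():.4f}  floor={e[-2000:].mean():.4f}")
```

In this sketch the error obeys err_{t+1} = (1 - n*eta) err_t + sigma*sqrt(n)*xi_t, so convergence speeds up with n while the noise floor, roughly sigma^2 / (2*eta - n*eta^2), diverges as n approaches 2/eta: a crude analogue of the optimal network size the paper derives.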

Item Type: Article
Uncontrolled Keywords: artificial intelligence, learning, neural network, optimization, synaptic plasticity
Divisions: Div F > Control
Depositing User: Cron Job
Date Deposited: 14 May 2019 01:51
Last Modified: 09 Sep 2021 01:09
DOI: 10.1073/pnas.1813416116