CUED Publications database

Fundamental bounds on learning performance in neural circuits

Raman, DV and Rotondo, AP and O'Leary, T (2019) Fundamental bounds on learning performance in neural circuits. Proceedings of the National Academy of Sciences of the United States of America, 116. pp. 10537-10546. ISSN 0027-8424

Full text not available from this repository.

Abstract

How does the size of a neural circuit influence its learning performance? Larger brains tend to be found in species with higher cognitive function and learning ability. Intuitively, we expect the learning capacity of a neural circuit to grow with the number of neurons and synapses. We show how adding apparently redundant neurons and connections to a network can make a task more learnable. Consequently, large neural circuits can either devote connectivity to generating complex behaviors or exploit this connectivity to achieve faster and more precise learning of simpler behaviors. However, we show that in a biologically relevant setting where synapses introduce an unavoidable amount of noise, there is an optimal size of network for a given task. Above the optimal network size, the addition of neurons and synaptic connections starts to impede learning performance. This suggests that the size of brain circuits may be constrained by the need to learn efficiently with unreliable synapses and provides a hypothesis for why some neurological learning deficits are associated with hyperconnectivity. Our analysis is independent of specific learning rules and uncovers fundamental relationships between learning rate, task performance, network size, and intrinsic noise in neural circuits.

Item Type: Article
Uncontrolled Keywords: q-bio.NC
Subjects: UNSPECIFIED
Divisions: Div F > Control
Depositing User: Cron Job
Date Deposited: 12 Jan 2019 20:41
Last Modified: 12 Dec 2019 02:00
DOI: 10.1073/pnas.1813416116