
Efficiency, information theory, and neural representations

Published online by Cambridge University Press:  30 August 2019

Joseph T. Devlin
Affiliation:
Centre for Speech and Language, Department of Experimental Psychology, University of Cambridge, Cambridge CB2 3EB, England. jtd21@cam.ac.uk, sam26@cam.ac.uk, rpr23@cam.ac.uk; csl.psychol.cam.ac.uk/~jdevlin
Matt H. Davis
Affiliation:
Medical Research Council, Cognition and Brain Sciences Unit, Cambridge CB2 2EF, England. matt.davis@mrc-cbu.cam.ac.uk
Stuart A. McLelland
Affiliation:
Centre for Speech and Language, Department of Experimental Psychology, University of Cambridge, Cambridge CB2 3EB, England. jtd21@cam.ac.uk, sam26@cam.ac.uk, rpr23@cam.ac.uk; csl.psychol.cam.ac.uk/~jdevlin
Richard P. Russell
Affiliation:
Centre for Speech and Language, Department of Experimental Psychology, University of Cambridge, Cambridge CB2 3EB, England. jtd21@cam.ac.uk, sam26@cam.ac.uk, rpr23@cam.ac.uk; csl.psychol.cam.ac.uk/~jdevlin

Abstract


We contend that if efficiency and reliability are important factors in neural information processing, then distributed, not localist, representations are "evolution's best bet." We note that distributed codes are the most efficient method for representing information, and that this efficiency minimizes metabolic costs, providing an adaptive advantage to an organism.
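The efficiency claim rests on a standard information-theoretic capacity argument: with n binary units, a localist (one-hot) code can distinguish only n items, whereas a distributed code can in principle distinguish up to 2^n, so representing N items needs N localist units but only about log2(N) distributed ones. The short Python sketch below makes this comparison concrete; the particular coding schemes (one-hot vs. binary), the helper name units_needed, and the example item counts are illustrative assumptions, not the commentary's own model.

    # Minimal sketch of the capacity argument: units required to represent
    # n_items distinct items under a localist (one-hot) code versus a
    # maximally distributed (binary) code.
    import math

    def units_needed(n_items: int) -> dict:
        """Return the number of units each coding scheme needs for n_items items."""
        return {
            "localist (one-hot)": n_items,
            "distributed (binary)": math.ceil(math.log2(n_items)),
        }

    for n in (8, 1000, 50000):  # 50,000 is roughly a typical word vocabulary
        print(n, units_needed(n))

Running the sketch shows the gap widening rapidly: 50,000 items require 50,000 localist units but only 16 distributed ones, which is the kind of saving that would matter if metabolic cost scales with the number of active or maintained units.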

Type: Brief Report
Copyright: 2000 Cambridge University Press