arXiv Analytics

arXiv:1501.01860 [q-bio.NC]

Applications of Information Theory to Analysis of Neural Data

Simon R. Schultz, Robin A. A. Ince, Stefano Panzeri

Published 2015-01-08 (Version 1)

Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and its capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. It has a number of useful properties: it is a general measure sensitive to any relationship, not only linear effects; it has meaningful units, which in many cases allow direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single trials, rather than in averages over multiple trials. A variety of information-theoretic quantities are commonly used in neuroscience (see the entry "Definitions of Information-Theoretic Quantities"). In this entry we review some applications of information theory in neuroscience to the study of information encoding in both single neurons and neuronal populations.
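As a minimal illustration of the kind of quantity the abstract refers to, the sketch below estimates the mutual information I(S;R) between a discrete stimulus and a discrete neural response from paired observations, using a naive plug-in estimator. This is an assumption-laden simplification for illustration only: the function name is hypothetical, and real neural analyses of the sort the paper reviews must also deal with limited-sampling bias, which this sketch ignores.

```python
import numpy as np

def mutual_information(stimuli, responses):
    """Plug-in estimate of I(S;R) in bits from paired discrete observations."""
    s_vals, s_idx = np.unique(stimuli, return_inverse=True)
    r_vals, r_idx = np.unique(responses, return_inverse=True)
    # Build the empirical joint distribution P(s, r).
    joint = np.zeros((len(s_vals), len(r_vals)))
    for i, j in zip(s_idx, r_idx):
        joint[i, j] += 1
    joint /= joint.sum()
    # Marginals P(s) and P(r).
    ps = joint.sum(axis=1, keepdims=True)
    pr = joint.sum(axis=0, keepdims=True)
    # Sum P(s,r) * log2(P(s,r) / (P(s) P(r))) over nonzero cells.
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])))

# Toy case: the response perfectly identifies a 4-way stimulus,
# so I(S;R) = log2(4) = 2 bits.
stim = [0, 1, 2, 3] * 50
resp = [10, 20, 30, 40] * 50
print(mutual_information(stim, resp))  # → 2.0
```

Because the units are bits, the result is directly interpretable: 2 bits means observing one response on a single trial resolves a four-way uncertainty about the stimulus, which is the single-trial interpretability property highlighted in the abstract.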

Comments: 8 pages, 2 figures
Journal: Encyclopedia of Computational Neuroscience 2014, pp 1-6
Categories: q-bio.NC
Related articles:
arXiv:1710.11279 [q-bio.NC] (Published 2017-10-31)
The Importance of Forgetting: Limiting Memory Improves Recovery of Topological Characteristics from Neural Data
arXiv:1902.11233 [q-bio.NC] (Published 2019-02-28)
The principles of adaptation in organisms and machines I: machine learning, information theory, and thermodynamics
arXiv:1501.01854 [q-bio.NC] (Published 2015-01-08)
Summary of Information Theoretic Quantities