arXiv:1412.0291 [q-bio.NC]

Bits from Biology for Computational Intelligence

Michael Wibral, Joseph T. Lizier, Viola Priesemann

Published 2014-11-30 (Version 1)

Computational intelligence is broadly defined as biologically inspired computing. Usually, inspiration is drawn from neural systems. This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent. Algorithms and representations identified information-theoretically may then guide the design of biologically inspired computing systems (BICS). The material covered includes the necessary introduction to information theory and to the estimation of information-theoretic quantities from neural data. We then show how to analyze the information a system encodes about its environment, and discuss recent methodological developments on the question of how much information each agent carries about the environment uniquely, redundantly, or synergistically together with others. Finally, we introduce the framework of local information dynamics, in which information processing is decomposed, locally in space and time, into component processes of information storage, transfer, and modification. We close by discussing example applications of these measures to neural data and other complex systems.
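
To make the last point concrete, below is a minimal sketch of how two of the local measures named in the abstract, local active information storage and local transfer entropy, can be estimated from discrete time series using simple plug-in (frequency) probabilities. The function names and the toy data are illustrative assumptions, not code from the article; practical analyses of neural data would use bias-corrected or continuous estimators.

    import numpy as np
    from collections import Counter

    def local_active_info_storage(x, k=1):
        """Local active information storage a(n) = log2 p(x_n | x_{n-k..n-1}) / p(x_n),
        estimated with plug-in probabilities on a discrete series."""
        n = len(x)
        pasts = [tuple(x[i - k:i]) for i in range(k, n)]
        nexts = [x[i] for i in range(k, n)]
        joint = Counter(zip(pasts, nexts))
        past_c = Counter(pasts)
        next_c = Counter(nexts)
        total = len(nexts)
        vals = []
        for past, nxt in zip(pasts, nexts):
            p_cond = joint[(past, nxt)] / past_c[past]   # p(x_n | past)
            p_next = next_c[nxt] / total                 # p(x_n)
            vals.append(np.log2(p_cond / p_next))
        return np.array(vals)

    def local_transfer_entropy(source, target, k=1):
        """Local transfer entropy t(n) = log2 p(x_n | x_past, y_{n-1}) / p(x_n | x_past)."""
        n = len(target)
        rows = [(tuple(target[i - k:i]), source[i - 1], target[i]) for i in range(k, n)]
        c_pst = Counter(rows)                                # (past, source, next)
        c_ps = Counter((r[0], r[1]) for r in rows)           # (past, source)
        c_pt = Counter((r[0], r[2]) for r in rows)           # (past, next)
        c_p = Counter(r[0] for r in rows)                    # (past,)
        vals = []
        for past, src, nxt in rows:
            p_full = c_pst[(past, src, nxt)] / c_ps[(past, src)]  # p(x_n | past, y)
            p_base = c_pt[(past, nxt)] / c_p[past]                # p(x_n | past)
            vals.append(np.log2(p_full / p_base))
        return np.array(vals)

    # Toy usage: the target copies the source with one step of delay, so the
    # average transfer from source to target should be close to 1 bit, while
    # the target's own storage should be close to 0 bits.
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, size=10000)
    x = np.roll(y, 1)
    print("mean local AIS (bits):", local_active_info_storage(x, k=1).mean())
    print("mean local TE  (bits):", local_transfer_entropy(y, x, k=1).mean())

Averaging the local values over time recovers the usual (average) active information storage and transfer entropy; keeping them resolved in time is what allows the space-time decomposition of information processing described above.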

Related articles:
arXiv:1501.01860 [q-bio.NC] (Published 2015-01-08)
Applications of Information Theory to Analysis of Neural Data
arXiv:2203.10810 [q-bio.NC] (Published 2022-03-21)
Information-theoretic analyses of neural data to minimize the effect of researchers' assumptions in predictive coding studies
arXiv:1902.11233 [q-bio.NC] (Published 2019-02-28)
The principles of adaptation in organisms and machines I: machine learning, information theory, and thermodynamics