arXiv:1505.04368 [q-bio.NC]

Measuring integrated information from the decoding perspective

Masafumi Oizumi, Shun-ichi Amari, Toru Yanagawa, Naotaka Fujii, Naotsugu Tsuchiya

Published 2015-05-17 (Version 1)

Accumulating evidence indicates that the capacity to integrate information in the brain is a prerequisite for consciousness. Integrated Information Theory (IIT) of consciousness provides a mathematical approach to quantifying the information integrated in a system, called integrated information, $\Phi$. Integrated information is defined theoretically as the amount of information a system generates as a whole, above and beyond the sum of the amounts of information its parts generate independently. IIT predicts that the amount of integrated information in the brain should reflect levels of consciousness. Empirical evaluation of this theory requires computing integrated information from experimentally acquired neural data, but difficulties with the original measure $\Phi$ preclude such computations. Although several practical measures have been proposed previously, we found that they fail to satisfy the theoretical requirements for a measure of integrated information. A measure of integrated information should satisfy the following lower and upper bounds: it should be 0 when the system generates no information (no information) or when the system consists of independent parts (no integration), and its upper bound is the amount of information generated by the whole system, attained when the amount of information generated independently by its parts is 0. Here, we derive a novel practical measure, $\Phi^*$, by introducing the concept of mismatched decoding from information theory. We show that $\Phi^*$ is properly bounded from below and above, as required of a measure of integrated information. We also derive an analytical expression for $\Phi^*$ under the Gaussian assumption, which makes it readily applicable to experimental data.
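
To make these bound requirements concrete, the decoding-based measure can be schematized as follows. The notation here (past state $X_{t-\tau}$, present state $X_t$, and mismatched-decoding information $I^*$ computed with a decoding model that ignores interactions between the parts) is assumed for illustration rather than quoted from the paper:

$$\Phi^* = I(X_{t-\tau}; X_t) - I^*(X_{t-\tau}; X_t), \qquad 0 \le \Phi^* \le I(X_{t-\tau}; X_t),$$

where $I$ is the mutual information between the past and present states of the whole system. Decoding with a mismatched, part-wise independent model can never convey more information than optimal decoding, so $I^* \le I$ and $\Phi^* \ge 0$; when the parts independently generate no information, $I^*$ vanishes and $\Phi^*$ attains the upper bound $I$.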

Related articles:
arXiv:2305.09826 [q-bio.NC] (Published 2023-05-16)
Upper bounds for integrated information
arXiv:1501.01860 [q-bio.NC] (Published 2015-01-08)
Applications of Information Theory to Analysis of Neural Data
arXiv:1806.09373 [q-bio.NC] (Published 2018-06-25)
Measuring Integrated Information: Comparison of Candidate Measures in Theory and Simulation