arXiv Analytics


arXiv:1307.6092 [cond-mat.stat-mech]

Role of Mutual Information in Entropy Production under Information Exchanges

Takahiro Sagawa, Masahito Ueda

Published 2013-07-23; updated 2013-09-30 (Version 2)

We relate the information exchange between two stochastic systems to the nonequilibrium entropy production in the whole system. By deriving a general formula that decomposes the total entropy production into the thermodynamic and informational parts, we obtain nonequilibrium equalities such as the fluctuation theorem in the presence of information processing. Our results apply not only to situations under measurement and feedback control, but also to those under multiple information exchanges between two systems, giving the fundamental energy cost for information processing and elucidating the thermodynamic and informational roles of a memory in information processing. We clarify a dual relationship between measurement and feedback.

Comments: Submitted to New Journal of Physics for the Focus Issue on "Nonequilibrium Fluctuation Relations: From Classical to Quantum"
Journal: New J. Phys. 15, 125012 (2013)
Categories: cond-mat.stat-mech