On the Thermodynamics of Information Processing: From Information Geometry to Souriau’s Theory of Heat
Information processing is bounded by thermodynamic constraints. Efficiency aspects of biological information processing in the midst of fluctuations are addressed from a geometric perspective. We inquire into entropy-driven systems designed to regulate life-sustaining processes. Such systems are characterized by a circular autonomy rooted in a sophisticated metabolic infrastructure that optimizes the entropy-production costs tied to the processing of information and to self-maintenance. The entropy produced is governed by an irreversible Fokker-Planck equation, which measures the heat dissipated to the environment as information is erased. This dissipated heat is the intrinsic cost of the logically irreversible operations that take place at the end of a thermodynamic cycle. An optimal dissipation protocol is a geodesic in a Riemannian univariate statistical manifold. The Riemannian structure arises naturally when fluctuations are integrated into the axioms of thermodynamics. In this structure, protocol efficiency is evaluated with respect to the geometric invariants of the statistical manifold: the Fisher information metric and the Amari-Chentsov tensor. Finally, we present generalizations of information geometry based on Souriau's symplectic interpretation of thermodynamics and link them to the preceding results on heat dissipation. A remarkable result stemming from this perspective is the equivalence between the Souriau-Fisher metric and a generalized heat capacity; we interpret this equivalence in the context of information processing and complexity.
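As a concrete illustration of the geometric objects the abstract invokes (this example is not part of the abstract itself), the Fisher information metric on the univariate Gaussian family has a simple closed form, and the heat cost of erasure referred to above is bounded below by Landauer's principle:

```latex
% Univariate Gaussian family:
%   p(x;\mu,\sigma) = \frac{1}{\sqrt{2\pi}\,\sigma}\,
%                     \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)
%
% Fisher information metric on the (\mu,\sigma) statistical manifold:
g = \frac{1}{\sigma^{2}}\,d\mu^{2} + \frac{2}{\sigma^{2}}\,d\sigma^{2}
%
% Up to the constant factor 2 in the second term, this is the hyperbolic
% metric of the upper half-plane, so geodesic (minimally dissipative)
% protocols trace hyperbolic geodesics in the (\mu,\sigma) coordinates.
%
% Landauer bound: erasing one bit of information dissipates at least
Q \;\ge\; k_{B} T \ln 2
```

The metric components follow from the standard Fisher information of the Gaussian location and scale parameters; the half-plane identification is a well-known fact of information geometry.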