Abstract
Maintained by environmental fluxes, biological systems are thermodynamic processes that operate far from equilibrium without detailed-balanced dynamics. Yet they often exhibit well-defined nonequilibrium steady states (NESSs). More importantly, critical thermodynamic functionality arises directly from transitions among their NESSs, driven by environmental switching. Here, we identify the constraints on excess heat and dissipated work necessary to control a system that is kept far from equilibrium by background, uncontrolled “housekeeping” forces. We do this by extending the Crooks fluctuation theorem to transitions among NESSs, without invoking an unphysical dual dynamics. This and the corresponding integral fluctuation theorems determine how much work must be expended when controlling systems maintained far from equilibrium. This generalizes thermodynamic feedback control theory, showing that Maxwellian Demons can leverage mesoscopic-state information to take advantage of the excess energetics in NESS transitions. We also generalize an approach recently used to determine the work dissipated when driving between functionally relevant configurations of an active, energy-consuming complex system. Altogether, these results highlight universal thermodynamic laws that apply to the accessible degrees of freedom within the effective dynamic at any emergent level of hierarchical organization. By way of illustration, we analyze a voltage-gated sodium ion channel whose molecular conformational dynamics play a critical functional role in propagating action potentials in mammalian neuronal membranes.
Notes
We ignore nonergodicity to simplify the development. The approach, though, handles nonergodicity just as well. However, distracting nuances arise that we do not wish to dwell on. For example, if the Markov chain has more than one attracting component for a particular x, then \(\varvec{\pi _x}\) is not unique, but can be constructed as any one of infinitely many probability-normalized linear superpositions of left eigenvectors of \(\mathsf{{T}}^{( \varvec{\mathcal {S} } \rightarrow \varvec{\mathcal {S} } | x)}\) associated with the eigenvalue of unity.
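The nonuniqueness of \(\varvec{\pi _x}\) under nonergodicity is easy to see numerically. The following is a minimal NumPy sketch (not from the paper) using a hypothetical four-state chain with two disjoint attracting components; the two component stationary distributions, and any probability-normalized mixture of them, all satisfy the stationarity condition:

```python
import numpy as np

# Hypothetical 4-state chain with two disjoint attracting components,
# {0, 1} and {2, 3}; each row sums to one.
T = np.array([
    [0.9, 0.1, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.2, 0.8],
    [0.0, 0.0, 0.6, 0.4],
])

def stationary(T, mu0, iters=2000):
    """Power-iterate a row distribution mu0 onto its attracting component."""
    mu = np.asarray(mu0, dtype=float)
    for _ in range(iters):
        mu = mu @ T
    return mu

# Different starting distributions relax to different stationary vectors:
pi_a = stationary(T, [1.0, 0.0, 0.0, 0.0])   # supported on {0, 1}
pi_b = stationary(T, [0.0, 0.0, 0.0, 1.0])   # supported on {2, 3}

# Any convex combination is also a valid stationary distribution:
mix = 0.3 * pi_a + 0.7 * pi_b
assert np.allclose(mix @ T, mix)
```

Here `pi_a` and `pi_b` are the extremal left eigenvectors of the transition matrix with unit eigenvalue; the one-parameter family of mixtures between them exhausts the stationary distributions.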
We start in a discrete-time setup, but later translate to continuous time.
The sign conventions adopted for Q, \(Q_\text {hk}\), and \(Q_\text {ex}\) are slightly inharmonious. We take Q and \(Q_\text {ex}\) to be energy that spontaneously flows into a system at fixed x, whereas we have chosen for \(Q_\text {hk}\) to have the opposite sign convention, for easy comparison to the literature. As a result, our quantities technically satisfy \(Q_\text {ex}= Q + Q_\text {hk}\), rather than \(Q_\text {ex}= Q-Q_\text {hk}\).
To be more precise, we write \(\Pr (\mathcal {S} _t = s | \mathcal {S} _0 \sim \varvec{\mu }_0 , x_{1:t+1})\) as \(\Pr _{\mathcal {S} _0 \sim \varvec{\mu }_0}(\mathcal {S} _t = s | x_{1:t+1})\), since the probability is not conditioned on \(\varvec{\mu }_0\)—a probability measure for subsequent state sequences. Here, we simply gloss over this nuance, later adopting the shorthand: \(\Pr (\mathcal {S} _t = s | \varvec{\mu }_0 , x_{1:t+1})\).
Since \(e^{\langle {-Y}\rangle } \le \langle {e^{-Y}}\rangle \) and \(\ln (a)\) is monotonically increasing for positive-valued \(a \in \{ e^{- \langle {Y}\rangle }, \langle {e^{-Y}}\rangle \}\).
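This Jensen-inequality step can be checked numerically. A minimal sketch (an illustration, not from the paper), with an arbitrary Gaussian sample standing in for the random variable Y:

```python
import numpy as np

rng = np.random.default_rng(0)
Y = rng.normal(loc=1.0, scale=2.0, size=100_000)  # any sample works

lhs = np.exp(-Y.mean())   # e^{-<Y>}
rhs = np.exp(-Y).mean()   # <e^{-Y}>

# Jensen's inequality for the convex function e^{-y}:
assert lhs <= rhs
# Since ln is monotonically increasing, the ordering survives taking logs:
assert -Y.mean() <= np.log(rhs)
```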
The characteristic timescale is actually the net result of a combination of timescales from the inverse eigenvalues of G. Of necessity, these are the same timescales that determine the relaxation of the state distribution.
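As a sketch of this point (with a hypothetical rate matrix standing in for G, not one from the paper): the zero eigenvalue corresponds to the stationary mode, while the negative real parts of the remaining eigenvalues set the relaxation timescales of the state distribution:

```python
import numpy as np

# Hypothetical 3-state rate matrix G: nonnegative off-diagonal rates,
# each row sums to zero.
G = np.array([
    [-2.0,  1.5,  0.5],
    [ 0.3, -0.8,  0.5],
    [ 0.2,  0.4, -0.6],
])

evals = np.linalg.eigvals(G)

# One eigenvalue is (numerically) zero: the stationary mode.
assert min(abs(e) for e in evals) < 1e-10

# The rest have negative real parts; their inverse magnitudes are the
# relaxation timescales. The largest dominates the approach to the NESS.
timescales = sorted(-1.0 / e.real for e in evals if abs(e) > 1e-10)
```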
References
Crooks, G.E.: On thermodynamic and microscopic reversibility. J. Stat. Mech. 2011(7), P07008 (2011)
Crooks, G.E.: Nonequilibrium measurements of free energy differences for microscopically reversible Markovian systems. J. Stat. Phys. 90(5/6), 1481–1487 (1998)
Sagawa, T., Ueda, M.: Nonequilibrium thermodynamics of feedback control. Phys. Rev. E 85, 021104 (2012)
Wang, H., Oster, G.: Energy transduction in the F1 motor of ATP synthase. Nature 396(6708), 279–282 (1998)
Polettini, M., Esposito, M.: Irreversible thermodynamics of open chemical networks. I. Emergent cycles and broken conservation laws. J. Chem. Phys. 141(2), 024117 (2014)
Landauer, R.: Statistical physics of machinery: forgotten middle-ground. Physica A 194(1–4), 551–562 (1993)
Qian, H.: Nonequilibrium steady-state circulation and heat dissipation functional. Phys. Rev. E 64, 022101 (2001)
Horsthemke, W.: Noise induced transitions. In: Vidal, C., Pacault, A. (eds.) Non-equilibrium Dynamics in Chemical Systems: Proceedings of the International Symposium, Bordeaux, France, 3–7 Sept 1984, pp. 150–160. Springer, Berlin (1984)
Lindner, B., Garcia-Ojalvo, J., Neiman, A., Schimansky-Geier, L.: Effects of noise in excitable systems. Phys. Rep. 392(6), 321–424 (2004)
Crutchfield, J.P., Aghamohammadi, C.: Not all fluctuations are created equal: Spontaneous variations in thermodynamic function (2016). arXiv:1609.02519
Crutchfield, J.P.: The calculi of emergence: Computation, dynamics, and induction. Physica D 75, 11–54 (1994)
Riechers, P.M.: Exact results regarding the physics of complex systems via linear algebra, hidden Markov models, and information theory. PhD thesis, University of California, Davis (2016)
Seifert, U.: Stochastic thermodynamics, fluctuation theorems and molecular machines. Rep. Prog. Phys. 75(12), 126001 (2012)
Spinney, R., Ford, I.: Fluctuation relations: a pedagogical overview. Nonequilibrium Statistical Physics of Small Systems, pp. 3–56. Wiley, Weinheim (2013)
Oono, Y., Paniconi, M.: Steady state thermodynamics. Prog. Theor. Phys. Suppl. 130, 29–44 (1998)
Hatano, T., Sasa, S.: Steady-state thermodynamics of Langevin systems. Phys. Rev. Lett. 86, 3463–3466 (2001)
Trepagnier, E.H., Jarzynski, C., Ritort, F., Crooks, G.E., Bustamante, C.J., Liphardt, J.: Experimental test of Hatano and Sasa’s nonequilibrium steady-state equality. Proc. Natl. Acad. Sci. USA 101(42), 15038–15041 (2004)
Mandal, D., Jarzynski, C.: Analysis of slow transitions between nonequilibrium steady states. J. Stat. Mech. 2016(6), 063204 (2016)
Evans, D.J., Searles, D.J., Williams, S.R.: The Evans–Searles fluctuation theorem. Fundamentals of Classical Statistical Thermodynamics, pp. 49–64. Wiley, Weinheim (2016)
Evans, D.J., Cohen, E.G.D., Morriss, G.P.: Probability of second law violations in shearing steady states. Phys. Rev. Lett. 71(15), 2401 (1993)
Gallavotti, G., Cohen, E.G.D.: Dynamical ensembles in stationary states. J. Stat. Phys. 80(5), 931–970 (1995)
Evans, D.J., Searles, D.J., Rondoni, L.: Application of the Gallavotti-Cohen fluctuation relation to thermostated steady states near equilibrium. Phys. Rev. E 71(5), 056120 (2005)
Esposito, M., Van den Broeck, C.: Three detailed fluctuation theorems. Phys. Rev. Lett. 104, 090601 (2010)
Crooks, G.E.: Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences. Phys. Rev. E 60(3), 2721–2726 (1999)
Roldán, E., Parrondo, J.M.R.: Estimating dissipation from single stationary trajectories. Phys. Rev. Lett. 105(15), 150607 (2010)
Horowitz, J.M., Vaikuntanathan, S.: Nonequilibrium detailed fluctuation theorem for repeated discrete feedback. Phys. Rev. E 82, 061120 (2010)
England, J.L.: Statistical physics of self-replication. J. Chem. Phys. 139(12), 121923 (2013)
England, J.L.: Dissipative adaptation in driven self-assembly. Nat. Nanotech. 10(11), 919–923 (2015)
Perunov, N., Marsland, R.A., England, J.L.: Statistical physics of adaptation. Phys. Rev. X 6(2), 021036 (2016)
Ruelle, D., Takens, F.: On the nature of turbulence. Commun. Math. Phys. 20(3), 167–192 (1971)
Mackey, M.C.: Time’s Arrow: The Origins of Thermodynamic Behavior. Dover Publications, New York (2003)
Cover, T.M., Thomas, J.A.: Elements of Information Theory, 2nd edn. Wiley, New York (2006)
Still, S., Sivak, D.A., Bell, A.J., Crooks, G.E.: Thermodynamics of prediction. Phys. Rev. Lett. 109, 120604 (2012)
Crutchfield, J.P., Young, K.: Inferring statistical complexity. Phys. Rev. Lett. 63, 105–108 (1989)
Lan, G., Sartori, P., Neumann, S., Sourjik, V., Tu, Y.: The energy-speed-accuracy trade-off in sensory adaptation. Nat. Phys. 8(5), 422–428 (2012)
Sartori, P., Granger, L., Lee, C.F., Horowitz, J.M.: Thermodynamic costs of information processing in sensory adaptation. PLoS Comput. Biol. 10(12), e1003974 (2014)
Hartich, D., Barato, A.C., Seifert, U.: Sensory capacity: an information theoretical measure of the performance of a sensor. Phys. Rev. E 93(2), 022116 (2016)
Esposito, M., Harbola, U., Mukamel, S.: Entropy fluctuation theorems in driven open systems: application to electron counting statistics. Phys. Rev. E 76, 031132 (2007)
Bagci, G.B., Tirnakli, U., Kurths, J.: The second law for the transitions between the non-equilibrium steady states. Phys. Rev. E 87, 032161 (2013)
Gaveau, B., Schulman, L.S.: A general framework for non-equilibrium phenomena: the master equation and its formal consequences. Phys. Lett. A 229(6), 347–353 (1997)
Sivak, D.A., Crooks, G.E.: Near-equilibrium measurements of nonequilibrium free energy. Phys. Rev. Lett. 108(15), 150601 (2012)
Deffner, S., Lutz, E.: Information free energy for nonequilibrium states (2012). arXiv:1201.3888
Qian, H.: Cycle kinetics, steady state thermodynamics and motors: a paradigm for living matter physics. J. Phys.: Condens. Matter 17(47), S3783 (2005)
Liepelt, S., Lipowsky, R.: Steady-state balance conditions for molecular motor cycles and stochastic nonequilibrium processes. Europhys. Lett. 77(5), 50002 (2007)
Liepelt, S., Lipowsky, R.: Kinesin’s network of chemomechanical motor cycles. Phys. Rev. Lett. 98, 258102 (2007)
Crooks, G.E.: Path-ensemble averages in systems driven far from equilibrium. Phys. Rev. E 61(3), 2361–2366 (2000)
Chernyak, V.Y., Chertkov, M., Jarzynski, C.: Path-integral analysis of fluctuation theorems for general Langevin processes. J. Stat. Mech. 2006(8), P08001 (2006)
Harris, R.J., Schütz, G.M.: Fluctuation theorems for stochastic dynamics. J. Stat. Mech. 2007, P07020 (2007)
Seifert, U.: Entropy production along a stochastic trajectory and an integral fluctuation theorem. Phys. Rev. Lett. 95, 040602 (2005)
Lahiri, S., Jayannavar, A.M.: Fluctuation theorems for excess and housekeeping heat for underdamped Langevin systems. Eur. Phys. J. B 87(9), 195 (2014)
Jarzynski, C.: Nonequilibrium equality for free energy differences. Phys. Rev. Lett. 78(14), 2690–2693 (1997)
Speck, T., Seifert, U.: Integral fluctuation theorem for the housekeeping heat. J. Phys. A 38(34), L581 (2005)
Vaikuntanathan, S., Jarzynski, C.: Dissipation and lag in irreversible processes. Europhys. Lett. 87(6), 60005 (2009)
O’Leary, T., Williams, A.H., Franci, A., Marder, E.: Cell types, network homeostasis, and pathological compensation from a biologically plausible ion channel expression model. Neuron 82(4), 809–821 (2014)
Turrigiano, G.G., Nelson, S.B.: Homeostatic plasticity in the developing nervous system. Nat. Rev. Neurosci. 5(2), 97–107 (2004)
Sengupta, B., Stemmler, M.B.: Power consumption during neuronal computation. Proc. IEEE 102(5), 738–750 (2014)
Howarth, C., Peppiatt-Wildman, C.M., Attwell, D.: The energy use associated with neural computation in the cerebellum. J. Cereb. Blood Flow Metab. 30(2), 403–414 (2010)
Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience Series, revised edn. MIT Press, Boston (2005)
Attwell, D., Laughlin, S.B.: An energy budget for signaling in the grey matter of the brain. J. Cereb. Blood Flow Metab. 21(10), 1133–1145 (2001)
Izhikevich, E.M.: Dynamical Systems in Neuroscience. Computational Neuroscience Series. MIT Press, Boston (2010)
Rieke, F., Warland, D., de Ruyter van Steveninck, R., Bialek, W.: Spikes: Exploring the Neural Code. Bradford Books, New York (1999)
Hodgkin, A.L., Huxley, A.F.: A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol. 117(4), 500–544 (1952)
Patlak, J.: Molecular kinetics of voltage-dependent Na\(^+\) channels. Physiol. Rev. 71(4), 1047–1080 (1991)
Crutchfield, J.P., Ellison, C.J., Riechers, P.M.: Exact complexity: Spectral decomposition of intrinsic computation. Phys. Lett. A 380(9–10), 998–1002 (2016)
Riechers, P.M., Crutchfield, J.P.: Beyond the spectral theorem: Decomposing arbitrary functions of nondiagonalizable operators (2016). arXiv:1607.06526 [math-ph]
Lacoste, D., Lau, A.W.C., Mallick, K.: Fluctuation theorem and large deviation function for a solvable model of a molecular motor. Phys. Rev. E 78, 011915 (2008)
Murashita, Y., Funo, K., Ueda, M.: Nonequilibrium equalities in absolutely irreversible processes. Phys. Rev. E 90, 042110 (2014)
Altaner, B., Wachtel, A., Vollmer, J.: Fluctuating currents in stochastic thermodynamics II: energy conversion and nonequilibrium response in kinesin models (2015). arXiv:1504.03648
Colquhoun, D., Hawkes, A.G.: Relaxation and fluctuations of membrane currents that flow through drug-operated channels. Proc. R. Soc. Lond. B 199(1135), 231–262 (1977)
Lahiri, S., Ganguli, S.: A memory frontier for complex synapses. In: Burges C.J.C., Bottou L., Welling M., Ghahramani Z., Weinberger K.Q. (eds.) Advances in Neural Information Processing Systems 26, pp. 1034–1042. Curran Associates, Inc. (2013)
Shen, Q., Hao, Q., Gruner, S.M.: Macromolecular phasing. Phys. Today 59(3), 46–52 (2006)
Anderson, P.W.: More is different. Science 177(4047), 393–396 (1972)
Boyd, A.B., Mandal, D., Crutchfield, J.P.: Correlation-powered information engines and the thermodynamics of self-correction. Phys. Rev. E 95(1), 012152 (2017)
Boyd, A.B., Mandal, D., Crutchfield, J.P.: Leveraging environmental correlations: The thermodynamics of requisite variety. J. Stat. Phys. 167(6), 1555–1585 (2017)
Speck, T., Seifert, U.: The Jarzynski relation, fluctuation theorems, and stochastic thermodynamics for non-Markovian processes. J. Stat. Mech. 2007(9), L09002 (2007)
Acknowledgements
We thank Tony Bell, Alec Boyd, Gavin Crooks, Sebastian Deffner, Chris Jarzynski, John Mahoney, Dibyendu Mandal, and Adam Rupe for useful feedback. We thank the Santa Fe Institute for its hospitality during visits. JPC is an SFI External Faculty member. This material is based upon work supported by, or in part by, the U. S. Army Research Laboratory and the U. S. Army Research Office under contracts W911NF-12-1-0234, W911NF-13-1-0390, and W911NF-13-1-0340.
Appendices
Appendix A: Extension to Non-Markovian Instantaneous Dynamics
Commonly, theoretical developments assume that state-to-state transitions are instantaneously Markovian given the input. This assumption works well in many cases, but fails in others with strong coupling between system and environment. Fortunately, we can straightforwardly generalize the results of stochastic thermodynamics by taking a system’s observable states to be functions of latent variables \({\varvec{{\mathcal {R}}}}\). The goal in the following is to highlight the necessary changes, so that our derivations can be adapted relatively directly to non-Markovian dynamics. See Ref. [75] for an alternative approach to non-Markovian dynamics.
1.1 Latent States, System States, and Their Many Distributions
Even with constant environmental input, the dynamic over a system’s states need not obey detailed balance nor exhibit any finite Markov order. We assume that the classical observed states \( \varvec{\mathcal {S} } \) are functions \(f: {\varvec{{\mathcal {R}}}}\rightarrow \varvec{\mathcal {S} } \) of a latent Markov chain. We also assume that the stochastic transitions among latent states are determined by the current environmental input \(x \in \mathcal {X}\), which can depend arbitrarily on all previous input and system-state history. The Perron–Frobenius theorem guarantees that there is a stationary distribution over latent states associated with each fixed input x; the function of the Markov chain maps this stationary distribution over latent states into the stationary distribution over system states. These are the stationary distributions associated with system NESSs.
We assume too that the \({\varvec{{\mathcal {R}}}}\)-to-\({\varvec{{\mathcal {R}}}}\) transitions are Markovian given the input. However, different inputs induce different Markov chains over the latent states. This can be described by a (possibly infinite) set of input-conditioned transition matrices over the latent state set \({\varvec{{\mathcal {R}}}}\): \(\{ \mathsf{{T}}^{({\varvec{{\mathcal {R}}}}\rightarrow {\varvec{{\mathcal {R}}}}| x)} \}_{x \in \mathcal {X}}\), where \(\mathsf{{T}}^{({\varvec{{\mathcal {R}}}}\rightarrow {\varvec{{\mathcal {R}}}}| x)}_{i,j} = \Pr ({\mathcal {R}}_{t} = r^j | {\mathcal {R}}_{t-1} = r^i , X_t = x)\). Probabilities regarding actual state paths can be obtained from the latent-state-to-state transition dynamic together with the observable-state projectors, which we now define.
We denote distributions over the latent states as bold Greek symbols, such as \(\varvec{\mu }\). As in the main text, it is convenient to cast \(\varvec{\mu }\) as a row-vector, in which case it appears as the bra \(\langle {\varvec{\mu }}|\). The distribution over latent states \({\varvec{{\mathcal {R}}}}\) implies a distinct distribution over observable states \( \varvec{\mathcal {S} } \). A sequence of driving inputs updates the distribution: \(\varvec{\mu }_{t+n}(\varvec{\mu }_{t}, x_{t:t+n})\). In particular:
(Recall that time indexing is denoted by subscript ranges n : m that are left-inclusive and right-exclusive.) An infinite driving history induces a distribution over the state space, and \(\varvec{\pi _x}\) is the specific steady-state distribution over \({\varvec{{\mathcal {R}}}}\) induced by tireless repetition of the single environmental drive x. Explicitly: \(\langle {\varvec{\pi _x}}| = \lim _{n \rightarrow \infty } \langle {\varvec{\mu }}| \bigl ( \mathsf{{T}}^{({\varvec{{\mathcal {R}}}}\rightarrow {\varvec{{\mathcal {R}}}}| x)} \bigr )^{n}\).
Usefully, \(\varvec{\pi _x}\) can also be found as the left eigenvector of \(\mathsf{{T}}^{({\varvec{{\mathcal {R}}}}\rightarrow {\varvec{{\mathcal {R}}}}| x)}\) associated with the eigenvalue of unity: \(\langle {\varvec{\pi _x}}| = \langle {\varvec{\pi _x}}| \, \mathsf{{T}}^{({\varvec{{\mathcal {R}}}}\rightarrow {\varvec{{\mathcal {R}}}}| x)}\).
The physically relevant steady-state probabilities are this vector’s projection onto the observable states: \(\pi _x(s) = \langle {\varvec{\pi _{x}} | s}\rangle \), where \(|{s}\rangle = |{\delta _{s,f(r)}}\rangle \) is represented in the latent-state basis by a vector whose elements are all 0s, except for 1s at the latent states that map to the observable state s.
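This construction can be sketched numerically. The following minimal NumPy example (an illustration with a hypothetical chain, not from the paper) finds \(\varvec{\pi _x}\) as the unit left eigenvector of an input-conditioned latent transition matrix and projects it onto observable states through an observation function f that collapses two latent states into one observable state:

```python
import numpy as np

# Hypothetical latent chain for one fixed input x: three latent states
# r0, r1, r2, with observation function f mapping r0, r1 -> s0 and r2 -> s1.
T_x = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])
f = np.array([0, 0, 1])  # f[i] = observable-state index of latent state r_i

# pi_x: left eigenvector of T_x with eigenvalue 1, normalized to a distribution.
evals, evecs = np.linalg.eig(T_x.T)
pi_x = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
pi_x = pi_x / pi_x.sum()
assert np.allclose(pi_x @ T_x, pi_x)  # stationarity

# Projection onto observable states: pi_x(s) sums the stationary
# probability of every latent state r with f(r) = s.
pi_obs = np.array([pi_x[f == s].sum() for s in range(2)])
```

The projector \(|{s}\rangle \) appears here as the boolean mask `f == s`; its inner product with `pi_x` is exactly the sum over latent states mapping to s.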
Assuming latent-state-to-state transitions are Markovian allows the distribution \(\varvec{\mu }\) over these latent states to summarize the causal relevance of the entire driving history.
1.2 Implications
A semi-infinite history induces a particular distribution over system latent states and implies another particular distribution over its observable states. This can be usefully recast in terms of the “start” (or initial) distribution \(\varvec{\mu }_0\) induced by the path \(x_{-\infty :1}\) and the driving history \(x_{1:t+1}\) since then, giving the entropy of the induced state distribution:
Or, employing the new distribution and the driving history since then, the path entropy (functional of state and driving history) can be expressed simply in terms of the current distribution over latent states and the candidate observable state s:
Averaging the path-conditional state entropy over observable states again gives a genuine input-conditioned Shannon state entropy:
It is again easy to show that the state-averaged path entropy \(k_\text {B}{\text {H}}[\mathcal {S} _t | {\overleftarrow{x}}_{t}]\) is an extension of the system’s steady-state nonequilibrium entropy. In steady-state, the state-averaged path entropy reduces to:
The nonsteady-state addition to free energy is:
Averaging over observable states this becomes the relative entropy:
which is always nonnegative.
Using this setup and decomposing:
in analogy with Eq. (25), it is straightforward to extend the remaining results of the main body to the setting in which observed states are functions of a Markov chain. Notably, the path dependencies pick up new contributions from non-Markovianity. Also, knowledge of the distribution over latent states provides a thermodynamic advantage to Maxwellian Demons.
Appendix B: Integral Fluctuation Theorems with Auxiliary Variables
Recall that we quantify how much the auxiliary variable independently informs the state sequence via the nonaveraged conditional mutual information:
Note that averaging over the input, state, and auxiliary sequences gives the familiar conditional mutual information:
(Averaging over distributions is the same as being given the distribution, since the distribution over distributions is assumed to be peaked at \({\varvec{\mu }}_\text {F}\).)
Noting that:
where , we have the integral fluctuation theorem (IFT):
Notably, this relation holds arbitrarily far from equilibrium and allows for the starting and ending distributions to both be nonsteady-state.
It is tempting to conclude that the revised Second Law of Thermodynamics should read:
which includes the effects of both irreversibility and the conditional mutual information between the state sequence and the auxiliary sequence, given the input sequence. However, we expect that \(\left\langle Q_\text {hk}\right\rangle > 0\), so Eq. (B1) is not the strongest bound derivable. Dropping \(\Psi \) from the IFT still yields a valid equality. However, the derivation then runs differently, since it depends on the normalization of the dual dynamic, that is, on quantities of the form:
These are mathematically sound transition probabilities \(\widetilde{\mathsf{{T}}}_{s^n, s^{n-1}}^{( \varvec{\mathcal {S} } \rightarrow \varvec{\mathcal {S} } | x^n)}\), but they belong only to a nonphysical, artificial dynamic. Although IFTs with \(\Psi \) may be useful for other reasons, it is the non-\(\Psi \) IFTs that appear to yield the tighter bounds for the revised Second Laws of information thermodynamics without detailed balance.
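To illustrate that such dual quantities are normalized yet nonphysical, here is a minimal NumPy sketch (a hypothetical example, not from the paper) using a driven three-state cycle that breaks detailed balance. The dual transition matrix \(\widetilde{\mathsf{{T}}}_{i,j} = \pi _j \mathsf{{T}}_{j,i} / \pi _i\) built from the stationary distribution is row-stochastic, but it differs from the physical dynamic:

```python
import numpy as np

# Hypothetical driven 3-state cycle; detailed balance is broken
# (probability flux circulates around the cycle).
T = np.array([
    [0.1, 0.8, 0.1],
    [0.1, 0.1, 0.8],
    [0.8, 0.1, 0.1],
])

# Stationary distribution pi: left eigenvector with unit eigenvalue.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
pi = pi / pi.sum()

# Dual dynamic: T_dual[i, j] = pi[j] * T[j, i] / pi[i].
T_dual = (pi[None, :] * T.T) / pi[:, None]

# Rows sum to one, so T_dual is a mathematically sound stochastic matrix...
assert np.allclose(T_dual.sum(axis=1), 1.0)
# ...but it is not the physical dynamic when detailed balance is broken.
assert not np.allclose(T_dual, T)
```

The row normalization follows directly from stationarity: summing \(\pi _j \mathsf{{T}}_{j,i}\) over j gives back \(\pi _i\), which cancels the denominator.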
Riechers, P.M., Crutchfield, J.P. Fluctuations When Driving Between Nonequilibrium Steady States. J Stat Phys 168, 873–918 (2017). https://doi.org/10.1007/s10955-017-1822-y