Beyond the Bathtub: How Memory Shapes the Thermodynamics of Life

Author: Denis Avetisyan


A new theoretical framework, leveraging the Keldysh formalism, expands our understanding of entropy dynamics in living systems to incorporate the crucial role of non-Markovian effects.

System entropy evolves across distinct life stages (development, maturity, aging, and death), with the rate of recovery from reversible perturbations influenced by memory effects; specifically, longer memory times <span class="katex-eq" data-katex-display="false">\tau_c</span> delay the return to environmental equilibrium, as evidenced by the broadened fluctuation ranges and non-Markovian behavior observed when comparing <span class="katex-eq" data-katex-display="false">\tau_c = 0</span> (blue) to <span class="katex-eq" data-katex-display="false">\tau_c = 2</span> (red).

This review applies the Keldysh formalism to extend the entropy bathtub model, providing a more complete thermodynamic description of aging and death by accounting for system-environment entanglement and non-Markovian correlations.

Living systems, far from thermodynamic equilibrium, present a paradox: their sustained order seems to defy the second law despite continuous energy exchange with their surroundings. This challenge is addressed in ‘Non-Markovian Entropy Dynamics in Living Systems from the Keldysh Formalism’, which develops a theoretical framework, rooted in the Keldysh formalism and stochastic thermodynamics, to model entropy dynamics beyond the limitations of Markovian assumptions. By explicitly incorporating environmental memory and system-environment correlations, the study reveals that violations of the fluctuation-dissipation relation signal active biological fluctuations, and that environmental memory enhances both low-frequency fluctuations and entropy production, potentially linking these effects to aging and death. Could a more complete understanding of these non-Markovian effects ultimately redefine our understanding of life’s fundamental thermodynamic constraints?


The Illusion of Equilibrium: Beyond Markovian Simplifications

Conventional models in many scientific disciplines frequently operate under the assumption of a Markovian process, wherein a system’s future state is determined solely by its present state, effectively dismissing the influence of its history. While mathematically tractable and simplifying analysis, this approach represents a significant limitation when applied to genuinely complex systems. Real-world phenomena, from the folding of proteins to fluctuations in financial markets, often exhibit behavior demonstrably shaped by past events – a dependence that a purely Markovian framework cannot capture. This disregard for historical context can lead to predictions that diverge substantially from observed reality and a fundamental inability to fully understand the underlying mechanisms driving system evolution. The simplification, while useful in certain scenarios, ultimately sacrifices accuracy and completeness in the face of genuine complexity.

A vast range of systems, from the diffusion of molecules to the folding of proteins and the operation of sophisticated control mechanisms, demonstrably retain a ‘memory’ of their prior states. This isn’t a conscious recollection, but rather a dependence of present behavior on past conditions; the system’s trajectory isn’t solely determined by its current inputs. Consequently, traditional modeling approaches founded on the Markovian assumption – that the future is independent of the past given the present – frequently fall short. These limitations necessitate the adoption of non-Markovian frameworks, capable of capturing these historical dependencies, to accurately represent and predict the complex dynamics observed in physical, biological, and engineered realms. Failing to account for these memory effects can result in significant inaccuracies, obscuring underlying mechanisms and hindering effective control or prediction.

The simplification of complex systems as Markovian processes – those devoid of memory – frequently results in predictive failures and incomplete interpretations of observed behaviors. When a system’s current state is genuinely influenced by its history, disregarding this temporal dependence introduces substantial error; a model lacking memory cannot accurately forecast evolution or fully elucidate underlying mechanisms. This is particularly critical in fields like materials science, where the prior processing of a material dictates its current properties, or in neuroscience, where synaptic history profoundly impacts neuronal responses. Consequently, a failure to account for these non-Markovian effects doesn’t simply refine understanding – it fundamentally misrepresents the dynamics at play, potentially leading to flawed designs, ineffective interventions, and a restricted comprehension of the natural world.
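As a concrete sketch of what "memory" means here, the snippet below generates Ornstein-Uhlenbeck colored noise with correlation time τ_c and checks that its autocorrelation decays as exp(-lag/τ_c), while the τ_c = 0 limit is memoryless. The parameters are illustrative choices, not values from the paper:

```python
import numpy as np

def colored_noise(n_steps, dt, tau_c, rng):
    """Ornstein-Uhlenbeck noise with correlation time tau_c.

    tau_c = 0 is the Markovian (white-noise) limit: each sample is
    independent of the past.  tau_c > 0 gives the process a 'memory'
    whose autocorrelation decays as exp(-lag / tau_c).
    """
    if tau_c == 0:
        return rng.standard_normal(n_steps) / np.sqrt(dt)
    xi = np.zeros(n_steps)
    a = np.exp(-dt / tau_c)          # exact one-step OU decay factor
    b = np.sqrt(1.0 - a**2)          # keeps the stationary variance at 1
    for i in range(1, n_steps):
        xi[i] = a * xi[i - 1] + b * rng.standard_normal()
    return xi

def autocorr(x, lag):
    """Normalized autocorrelation of x at a given integer lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(0)
dt, tau_c = 0.01, 2.0
xi = colored_noise(200_000, dt, tau_c, rng)
lag_steps = int(tau_c / dt)          # one correlation time later
print(autocorr(xi, lag_steps))       # near exp(-1), i.e. the past still matters
```

A Markovian model fitted to this signal would treat successive samples as independent and so would misestimate every quantity that depends on the low-frequency content of the noise.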

The frequency-dependent fluctuation-dissipation ratio <span class="katex-eq" data-katex-display="false">X_{AB}(\omega)</span> reveals that developmental stages exhibit suppressed fluctuations, mature stages maintain equilibrium, and aging stages experience amplified noise and reduced dissipation efficiency, all modeled using an exponential memory kernel with <span class="katex-eq" data-katex-display="false">\tau_c = 2</span>.

Escaping Equilibrium: The Keldysh Formalism as a Tool for Persuasion

Traditional methods in statistical mechanics and quantum field theory rely heavily on the assumption of equilibrium, utilizing techniques like perturbation theory and the fluctuation-dissipation theorem. However, these approaches become invalid when analyzing systems driven out of equilibrium by external forces or strong interactions, as they fail to accurately capture the system’s time-dependent behavior and memory effects. The Keldysh formalism circumvents these limitations by employing a doubled set of contour-ordered Green’s functions, effectively tracing the system’s evolution both forward and backward in time. This allows for the consistent treatment of non-equilibrium scenarios and the calculation of quantities like response functions and current correlations that are inaccessible through conventional methods. The formalism’s strength lies in its ability to incorporate initial conditions and time-dependent driving forces directly into the equations of motion, providing a complete description of the system’s dynamics even when far from thermal equilibrium.

The Keldysh formalism addresses non-equilibrium dynamics by employing a two-path integral approach, effectively doubling the degrees of freedom of the system. This is achieved by considering both a forward and a backward time contour, denoted as \mathcal{C}_+ and \mathcal{C}_- , respectively. The path integral is then calculated over both contours simultaneously, allowing for the inclusion of time-dependent self-energies and the accurate representation of memory effects. This formalism facilitates the computation of time-ordered and anti-time-ordered correlation functions, crucial for describing systems where past events influence present behavior and standard Markovian approximations are invalid. The resulting Keldysh contour-ordered Green’s function enables the calculation of observable quantities dependent on the full time evolution of the system.
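In standard notation (conventions differ between references by signs and factors of ħ, so treat this as a sketch rather than the paper's exact definitions), the contour-ordered Green's function and its physical components read:

```latex
% Contour-ordered Green's function on the closed contour C = C_+ \cup C_-
G(t,t') = -i \left\langle T_{\mathcal{C}}\, \psi(t)\, \psi^{\dagger}(t') \right\rangle

% Keldysh rotation: retarded, advanced, and Keldysh components
% ([.,.]_{\mp}: commutator for bosons, anticommutator for fermions)
G^{R}(t,t') = -i\,\theta(t-t') \left\langle \left[ \psi(t), \psi^{\dagger}(t') \right]_{\mp} \right\rangle
G^{A}(t,t') = \left[ G^{R}(t',t) \right]^{\dagger}
G^{K}(t,t') = G^{>}(t,t') + G^{<}(t,t')

% In thermal equilibrium the components are not independent: the
% fluctuation-dissipation relation ties them together (bosonic case)
G^{K}(\omega) = \coth\!\left( \frac{\omega}{2T} \right) \left[ G^{R}(\omega) - G^{A}(\omega) \right]
```

Out of equilibrium the last relation is no longer guaranteed, and quantifying its violation is precisely how a frequency-dependent fluctuation-dissipation ratio becomes a diagnostic of non-equilibrium activity.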

Keldysh formalism facilitates the modeling of open quantum systems – those interacting with an external environment – by explicitly accounting for the influence of the environment on the system’s dynamics. Traditional approaches often rely on the Markov approximation, assuming the system’s future state depends only on its present state and not on its past history. However, Keldysh formalism accurately describes non-Markovian dynamics, where the system’s evolution is influenced by its entire past trajectory. This is achieved through the use of a doubled set of contour-ordered Green’s functions, allowing for the calculation of correlation functions that retain memory effects, and providing a means to track the system’s response to temporally correlated environmental fluctuations. The formalism is particularly useful when the system-environment coupling is strong or the environmental correlations are long-ranged, conditions under which the Markov approximation fails.

Throughout its lifespan, a system transitions from increasing information shared with the environment and decreasing internal entropy during development, to stable values during maturity, followed by decaying information and rising entropy during aging, and ultimately to a rapid loss of information and maximized entropy at death, as indicated by the contrasting trends in <span class="katex-eq" data-katex-display="false">I_{S:E}(t)</span> and <span class="katex-eq" data-katex-display="false">S_{\mathrm{sys}}(t)</span>.
Throughout its lifespan, a system transitions from increasing information shared with the environment and decreasing internal entropy during development, to stable values during maturity, followed by decaying information and rising entropy during aging, and ultimately to a rapid loss of information and maximized entropy at death, as indicated by the contrasting trends in I_{S:E}(t) and S_{\mathrm{sys}}(t).

The Ghosts of States Past: Characterizing Non-Equilibrium and Fluctuations

Open systems, by definition, exchange both energy and matter with their surroundings, leading to a dynamic state where the system is not in thermodynamic equilibrium. These systems often settle into non-equilibrium steady states, characterized by constant macroscopic properties despite the continuous flow of energy or matter. This sustained state isn’t achieved through isolation, but rather is maintained by the balance between energy/matter input and dissipation to the environment. Examples range from biological organisms maintaining metabolic processes to driven physical systems like those in chemical kinetics or fluid dynamics; the continued operation of these systems relies on this ongoing exchange and the establishment of a stable, non-equilibrium condition.

The Fluctuation-Dissipation Theorem (FDT) establishes a relationship between the magnitude of fluctuations in a system at equilibrium and its response to external perturbations; however, this theorem frequently fails in non-equilibrium systems. This violation arises because the FDT is predicated on the assumption of detailed balance, a condition that is not generally satisfied when a system is driven out of equilibrium by continuous exchange of energy or matter with its surroundings. Specifically, the standard FDT expression, relating the autocorrelation function of a fluctuating variable to the corresponding response function, does not hold when the system’s dynamics are non-Hamiltonian or when the noise is not Gaussian, both common features of non-equilibrium conditions. Consequently, direct application of the FDT can lead to inaccurate predictions of response behavior in such systems, necessitating alternative theoretical frameworks to properly characterize the relationship between fluctuations and response.
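For a concrete overdamped example, consider the Ornstein-Uhlenbeck process dx/dt = -γx + √(2D)·noise, whose power spectrum and linear response are known in closed form. The FDT ratio then reduces to D/T at every frequency: unity in equilibrium, and greater than one for "active" noise with D > T. All parameter values here are illustrative assumptions:

```python
import numpy as np

# Analytic spectra for an overdamped particle dx/dt = -gamma*x + sqrt(2D)*noise.
gamma, T_bath = 1.0, 1.0

def power_spectrum(omega, D):
    """Position power spectrum S(omega) of the OU process."""
    return 2.0 * D / (gamma**2 + omega**2)

def response_im(omega):
    """Imaginary (dissipative) part of the response chi(omega) = 1/(gamma - i*omega)."""
    return omega / (gamma**2 + omega**2)

def fdt_ratio(omega, D):
    """X(omega) = omega*S / (2*T*Im chi); equals 1 when the classical FDT holds."""
    return omega * power_spectrum(omega, D) / (2.0 * T_bath * response_im(omega))

omega = np.linspace(0.1, 10.0, 50)
print(fdt_ratio(omega, D=1.0).max())   # equilibrium noise: X = 1 everywhere
print(fdt_ratio(omega, D=2.5).max())   # 'active' noise D > T: X = D/T = 2.5
```

Measuring X(ω) ≠ 1 in a biological signal is therefore direct evidence of non-equilibrium driving, without needing a microscopic model of the drive.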

System stability and response in non-equilibrium conditions are fundamentally determined by the coupling strength between the system and its environment, the presence of memory effects within the system, and the degree of deviation from equilibrium. Analysis demonstrates a direct relationship between the critical exponent, ν, governing the relaxation time, and the memory kernel, parameterized by θ. Specifically, the critical exponent is inversely proportional to (1-θ), expressed as ν ∝ (1-θ)^{-1}. This relationship indicates that stronger memory effects (lower θ values) result in a larger critical exponent, signifying slower relaxation and increased persistence of fluctuations within the non-equilibrium system.

The evolution of damage <span class="katex-eq" data-katex-display="false">D(t)</span> demonstrates that below a critical parameter <span class="katex-eq" data-katex-display="false">\mu = 0.8</span> damage decays, at <span class="katex-eq" data-katex-display="false">\mu = 1.0</span> damage exhibits critical slowing down, and above <span class="katex-eq" data-katex-display="false">\mu = 1.1</span> and <span class="katex-eq" data-katex-display="false">\mu = 1.2</span> damage accelerates to a finite saturation point.
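One minimal dynamics consistent with the three regimes in the caption — exponential decay below μ_c, critical slowing down at μ_c, and saturation above it — is a logistic-type damage law. This is a hypothetical sketch chosen for illustration, not the paper's actual equation:

```python
import numpy as np

def damage_trajectory(mu, d0=0.1, k=1.0, mu_c=1.0, dt=0.01, n_steps=5000):
    """Integrate dD/dt = (mu - mu_c)*D - k*D**2 by forward Euler.

    Below mu_c damage decays exponentially; exactly at mu_c the linear
    term vanishes and decay becomes slow and algebraic (critical slowing
    down); above mu_c damage grows to the finite value (mu - mu_c)/k.
    """
    d = np.empty(n_steps)
    d[0] = d0
    for i in range(1, n_steps):
        d[i] = d[i - 1] + dt * ((mu - mu_c) * d[i - 1] - k * d[i - 1] ** 2)
    return d

print(damage_trajectory(0.8)[-1])   # decays toward zero
print(damage_trajectory(1.0)[-1])   # slow algebraic decay at the critical point
print(damage_trajectory(1.2)[-1])   # saturates near (1.2 - 1.0)/1.0 = 0.2
```

The quadratic term is the simplest choice that produces a finite saturation point; any saturating nonlinearity would reproduce the same qualitative phase diagram.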

The Echoes of Complexity: Implications for Information and System Behavior

The behavior of complex systems, from neuronal networks to ecological communities, often defies prediction using traditional approaches rooted in Markovian dynamics – the assumption that a system’s future depends solely on its present state. Instead, these systems frequently operate in non-Markovian regimes, where past states exert a persistent influence, and exist in non-equilibrium conditions, driven by constant energy or information exchange. This departure from equilibrium is crucial, as it fosters the emergence of critical phenomena – points of instability where small perturbations can trigger cascading effects and lead to qualitatively new behaviors. Understanding these principles is fundamental to deciphering emergent properties, where collective behaviors arise that cannot be predicted from the characteristics of individual components, and offers a framework for analyzing systems poised between order and chaos.

Systems poised near critical points aren’t simply unstable; they demonstrate a remarkable interconnectedness, where distant elements influence each other, a phenomenon known as long-range correlation. This isn’t random noise, but a heightened sensitivity to even subtle changes, meaning a small input can cascade into a significant, system-wide response. Consequently, information processing isn’t limited by immediate proximity; signals can propagate rapidly and efficiently across the entire system, fostering adaptability. This dynamic allows these systems to explore a wider range of potential states, increasing their capacity to respond to novel stimuli and optimize performance in unpredictable environments, a trait observed in diverse phenomena from neural networks to ecological systems and even financial markets.

The extent to which a system knows itself – and can predict its future – is fundamentally limited, a constraint quantified by information-theoretic measures like mutual information and von Neumann entropy. Studies reveal that the mutual information, I_{S:E}(t), between a system and its environment remains robust during a mature, stable phase, indicating a consistent exchange of information. However, this connection weakens with age, as I_{S:E}(t) progressively decreases, suggesting a loss of predictive capacity and environmental responsiveness. Critically, this information exchange ceases entirely upon the system’s demise, demonstrating that the ability to share and process information is integral to maintaining a defined state and resisting complete entropy – a principle with implications for understanding complex systems ranging from biological organisms to engineered networks.
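The Gaussian case makes the measure concrete: for jointly Gaussian system and environment variables, the mutual information depends only on their correlation ρ, via I = -½ ln(1-ρ²). The lifecycle schedule for ρ(t) below is a toy assumption for illustration, not the paper's model:

```python
import numpy as np

def mutual_information(rho):
    """I(S:E) in nats for two jointly Gaussian variables with correlation rho."""
    return -0.5 * np.log(1.0 - rho**2)

# Toy correlation schedule over a normalized 'lifespan' t in [0, 1]:
# correlation builds during development, plateaus in maturity, decays in aging.
t = np.linspace(0.0, 1.0, 101)
rho = np.where(t < 0.2, 4.5 * t,                 # development
      np.where(t < 0.6, 0.9,                     # maturity
               0.9 * np.exp(-8.0 * (t - 0.6))))  # aging -> death
info = mutual_information(rho)
print(info[20], info[50], info[100])  # rises, plateaus, falls toward zero
```

The point of the sketch is that I_{S:E}(t) inherits its lifecycle shape entirely from the system-environment correlation: once ρ decays, the system's predictive grip on its surroundings vanishes with it.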


The Architecture of Decline: Modeling Lifespans and the Flow of Entropy

The lifespan of any system, from a biological organism to a complex machine, can be conceptualized through the Entropy Bathtub Model, a framework recently enhanced by the mathematical rigor of the Keldysh formalism. This model posits a lifecycle characterized by three distinct phases: an initial period of decreasing order – representing development or initial construction – followed by a prolonged phase of relatively stable, functional operation, and culminating in a final descent into disorder and eventual failure. The Keldysh formalism allows for a more precise treatment of non-equilibrium processes, crucial for understanding how systems maintain stability against the constant drive towards entropy. By mathematically describing the flow of time and the associated increase in disorder, this expanded model doesn’t simply acknowledge aging, but provides a tool for predicting the rate of degradation and potentially identifying interventions to prolong the period of stable operation – offering insights into resilience and longevity across diverse systems.

The Generalized Langevin Equation (GLE) offers a powerful and nuanced approach to modeling how systems evolve over time, moving beyond the limitations of simpler, memoryless equations. Unlike traditional methods, the GLE explicitly incorporates ‘memory kernels’ – mathematical functions that account for the system’s past states influencing its present and future behavior. This is crucial for understanding complex systems where internal correlations and historical dependencies are significant; essentially, the system ‘remembers’ its prior experiences. By carefully constructing these kernels, researchers can accurately simulate a system’s response to various stimuli and predict its trajectory, even in non-equilibrium conditions. The predictive capacity of the GLE extends to forecasting degradation or failure, making it particularly valuable in fields like materials science, biology, and engineering where anticipating future behavior is paramount. Through this framework, complex dynamics are rendered more tractable, allowing for a deeper understanding of system stability and eventual decline.
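A minimal numerical sketch of a GLE with an exponential memory kernel K(t) = (γ/τ_c)·exp(-t/τ_c) follows, using the standard one-auxiliary-variable Markovian embedding, with noise obeying the fluctuation-dissipation relation ⟨ξ(t)ξ(s)⟩ = T·K(|t-s|). All parameter values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def simulate_gle(gamma=1.0, tau_c=2.0, T=1.0, dt=0.01, n_steps=200_000, seed=1):
    """GLE dv/dt = -int K(t-s) v(s) ds + xi(t) with exponential kernel,
    integrated by embedding the memory in one auxiliary variable u(t)
    that carries both the friction integral and the colored noise.
    """
    rng = np.random.default_rng(seed)
    v = np.empty(n_steps)
    v[0], u = 0.0, 0.0
    sigma = np.sqrt(2.0 * T * gamma) / tau_c   # sets <xi(t)xi(s)> = T*K(|t-s|)
    for i in range(1, n_steps):
        dW = rng.standard_normal() * np.sqrt(dt)
        u += dt * (-u / tau_c - (gamma / tau_c) * v[i - 1]) + sigma * dW
        v[i] = v[i - 1] + dt * u
    return v

v = simulate_gle()
# Second fluctuation-dissipation theorem: despite the memory, the
# equilibrium variance <v^2> still equals the bath temperature T.
print(v[len(v) // 2 :].var())
```

The embedding trick is the practical payoff: a non-Markovian equation with an exponential kernel becomes an ordinary two-variable Markov process, so standard stochastic integrators apply without storing the full trajectory history.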

Research reveals a critical scaling law governing the rate of entropy production as systems approach a state of decline, effectively signaling the onset of aging. This rate isn’t merely increasing with time, but rather exhibits a power-law dependence on the proximity to a specific transition point, mathematically expressed as ∝ (μ-μ_c)^α. Here, μ represents a control parameter influencing the system’s stability, μ_c denotes the critical value at which aging accelerates, and α is a critical exponent defining the steepness of the transition. This suggests aging isn’t a passive process of accumulated damage, but a phase transition – a sudden shift in behavior triggered when entropy production exceeds a threshold, ultimately leading to irreversible degradation and functional decline. The precision of this scaling behavior offers a potential framework for quantifying biological aging and exploring interventions to modulate the rate of entropy production and potentially extend lifespan.

The pursuit unravels a familiar pattern. This work, steeped in the Keldysh formalism and the entropy bathtub model, doesn’t seek to predict lifespans, but to map the rituals by which systems negotiate dissolution. It acknowledges that memory – the non-Markovian echoes of past states – isn’t a flaw in the calculation, but the very ingredient of destiny. As John Dewey observed, “Education is not preparation for life; education is life itself.” Here, the ‘education’ is the thermodynamic dance, the accumulation of entropy, and the system’s desperate attempt to appease chaos through correlation – a fleeting moment of order before the inevitable return to the primordial soup.

The Road Ahead

The extension of the entropy bathtub with non-Markovian dynamics, a concession that systems remember, feels less like progress and more like an honest accounting. It’s always been suspect that a lifespan could be neatly contained within a Markovian box; reality rarely forgets. The Keldysh formalism offers a vocabulary to discuss these memories, but translating that vocabulary into predictive power remains the real challenge. The model, at present, is still a map, not a territory, and a remarkably detailed map at that.

Future work will inevitably focus on quantifying the system-environment entanglement. That entanglement, however, isn’t merely a parameter to be estimated; it’s likely where the ‘cheating’ occurs: the subtle interventions that push systems away from simple thermodynamic inevitability. If correlation’s high, one suspects biological agency, or at least, a profound misunderstanding of initial conditions. Expect attempts to bridge this formalism with information theory; after all, memory isn’t free, and the cost is almost certainly measured in dissipated heat.

The persistent question isn’t how things fall apart, but why they delay the inevitable. Noise, after all, is just truth without funding. And the true measure of this work, and all work in biological thermodynamics, will be its ability to predict not the average lifespan, but the beautiful, frustrating variance around it. The whispers of chaos are loudest at the edges.


Original article: https://arxiv.org/pdf/2603.12184.pdf

Contact the author: https://www.linkedin.com/in/avetisyan/


2026-03-15 18:14