Unlocking Energy from the Invisible

Author: Denis Avetisyan


New research reveals how hidden internal dynamics can be exploited to generate power beyond the limits of conventional systems.

Work extraction protocols demonstrate that energy stored within the harmonic degree of freedom can be successfully harvested; single-well potential experiments show accumulated normalized work <span class="katex-eq" data-katex-display="false">W/\bar{V}_{m}(0)</span> remains below unity despite peaking at <span class="katex-eq" data-katex-display="false">\delta t = 1\,\text{s}</span>, while a double-well potential facilitates work extraction exceeding <span class="katex-eq" data-katex-display="false">\bar{V}_{m}(0)</span> following barrier crossing, confirming energy retrieval from the system.

Exploiting non-Markovian effects and environmental memory in viscoelastic fluids enables work extraction from previously untapped degrees of freedom.

Conventional thermodynamics posits limits on work extraction from systems, often assuming negligible environmental memory. This research, titled ‘Extracting work from hidden degrees of freedom’, experimentally demonstrates that hidden degrees of freedom, and their associated non-Markovian dynamics, can serve as a thermodynamic resource. By implementing a time-delayed measurement protocol on a Brownian particle, we show that information retrieved from these ‘forgotten’ degrees of freedom enhances work extraction, even exceeding the energy stored in the observable system alone. Could systematically harnessing environmental memory unlock new efficiencies in information engines operating in complex, time-correlated environments?


Beyond Simple Assumptions: The Limits of Memoryless Modeling

Conventional thermodynamic modeling frequently relies on the principle of Markovian dynamics, a simplification that posits a system’s current state is solely determined by its present conditions, effectively disregarding its history. This approach, while mathematically convenient, can introduce significant inaccuracies when applied to systems where past states demonstrably influence present behavior. The assumption of ‘memorylessness’ fails to capture the inherent time-dependence present in many natural phenomena; a system’s trajectory isn’t simply a response to immediate stimuli, but a complex interplay between present forces and the accumulated effects of prior conditions. Consequently, these simplified models can miss critical details in describing the evolution of complex systems, limiting their predictive power and hindering a complete understanding of the underlying physics.

The assumption of present states fully defining a system’s future, a cornerstone of many thermodynamic models, often discards valuable information residing in its past. This historical context isn’t merely archival; it actively shapes the system’s current behavior, influencing responses to stimuli in ways a ‘memoryless’ model cannot capture. Consequently, analyses neglecting this temporal dependency can produce inaccurate predictions when applied to real-world phenomena, from the nuanced flow of polymeric fluids to the complex energy transfer within biological systems. The system’s trajectory, including prior configurations and interactions, effectively encodes crucial details that dictate its evolution, demonstrating that a complete understanding requires acknowledging the enduring relevance of history.

Many real-world systems defy the simplification of Markovian dynamics, instead exhibiting non-Markovian behavior where a material’s current state is intrinsically linked to its past experiences – essentially, it possesses a ‘memory’. This phenomenon is particularly pronounced in viscoelastic fluids like polymers and gels, where internal structural rearrangements induced by deformation persist over time, influencing subsequent responses. Beyond these fluids, non-Markovian effects are also observed in biological tissues, granular materials, and even certain quantum systems. Consequently, traditional analytical tools rooted in the assumption of memorylessness prove inadequate for accurately describing these complex media; researchers are actively developing advanced mathematical frameworks – including generalized master equations and non-equilibrium statistical mechanics – to capture the intricacies of these ‘memory-rich’ systems and unlock a more complete understanding of their behavior.
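A minimal numerical sketch can make this memory effect concrete. The snippet below is illustrative, not the paper's calibrated model: a Brownian particle in a harmonic trap coupled to a single hidden Maxwell bath mode, the standard Markovian embedding of an exponential-memory, viscoelastic environment. All parameter values are hypothetical, in reduced units with <span class="katex-eq" data-katex-display="false">k_B T = 1</span>.

```python
import numpy as np

# Illustrative sketch (not the paper's model): a Brownian particle in a
# harmonic trap coupled to one hidden Maxwell bath mode, i.e. a Markovian
# embedding of a viscoelastic environment with exponential memory.
# All parameters are hypothetical; reduced units with k_B T = 1.

rng = np.random.default_rng(0)

dt, n_steps = 1e-3, 100_000
k = 1.0                      # trap stiffness
gamma = 1.0                  # solvent friction on the particle
kappa = 2.0                  # coupling to the hidden bath mode
tau_m = 6.0                  # bath relaxation time (cf. the ~6 s structural timescale)
gamma_y = kappa * tau_m      # bath friction fixed by fluctuation-dissipation

x, y = 0.0, 0.0              # observed coordinate and hidden bath coordinate
xs = np.empty(n_steps)
for t in range(n_steps):
    fx = -k * x - kappa * (x - y)            # trap force plus memory force via y
    x += fx / gamma * dt + np.sqrt(2 * dt / gamma) * rng.standard_normal()
    y += kappa * (x - y) / gamma_y * dt + np.sqrt(2 * dt / gamma_y) * rng.standard_normal()
    xs[t] = x

# Integrating out y leaves x alone with an exponential memory kernel: the pair
# (x, y) is jointly Markovian, but the observed coordinate x by itself is not.
print(f"sample variance of x: {xs.var():.2f} (equipartition predicts 1/k = 1)")
```

The point of the embedding is that the "hidden degree of freedom" `y` stores the history of `x`; discarding `y` is exactly what turns a simple Markovian model into a non-Markovian one.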

Analysis of information decay reveals non-Markovian behavior, with conditioned KL divergences <span class="katex-eq" data-katex-display="false">I_{|i\rangle}(t)</span> and <span class="katex-eq" data-katex-display="false">\tilde{I}_{|i\rangle}(t)</span> demonstrating accelerated decay for specific states, a state-averaged excess of retained information at <span class="katex-eq" data-katex-display="false">\delta t = 1</span> s, and a maximized order parameter at approximately <span class="katex-eq" data-katex-display="false">\delta t = 1</span> s, indicating a pronounced memory effect.
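The decay of measurement information can be sketched with a toy estimator. The code below is illustrative, not the paper's analysis: a Brownian particle is conditioned on its sign at the first measurement, and the KL divergence between the conditioned distribution and equilibrium is estimated from histograms as a function of the delay. The model (an overdamped Ornstein-Uhlenbeck process with unit relaxation time) and all parameters are assumptions for the sketch.

```python
import numpy as np

# Sketch of the information-decay diagnostic (illustrative model, not the
# paper's): after conditioning on the particle's sign at t = 0, the KL
# divergence between the conditioned distribution and equilibrium decays
# as the first measurement is "forgotten".

rng = np.random.default_rng(1)

def kl_from_histograms(a, b, bins):
    """Empirical D_KL(p || q) from two samples, using a common binning."""
    p, _ = np.histogram(a, bins=bins, density=True)
    q, _ = np.histogram(b, bins=bins, density=True)
    dx = bins[1] - bins[0]
    mask = (p > 0) & (q > 0)
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx)

# Overdamped OU process with k_B T = k = gamma = 1, so tau_r = 1.
n, dt = 50_000, 1e-2
x = rng.standard_normal(n)        # equilibrium ensemble at t = 0
cond = x > 0                      # outcome of the first measurement
eq = rng.standard_normal(n)       # reference equilibrium sample
bins = np.linspace(-4, 4, 41)

kls = {}
for t_target in (0.1, 1.0, 3.0):
    xt = x.copy()
    for _ in range(int(round(t_target / dt))):   # free evolution over the delay
        xt += -xt * dt + np.sqrt(2 * dt) * rng.standard_normal(n)
    kls[t_target] = kl_from_histograms(xt[cond], eq, bins)
    print(t_target, round(kls[t_target], 3))
```

In this toy case the retained information decays monotonically with the delay; the experiment's signature of memory is precisely a deviation from such simple monotone forgetting.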

Probing the System’s Past: Revealing Hidden Information

Systems exhibiting non-Markovian behavior demonstrate a dependence on past states, contradicting the Markovian assumption of future states being solely determined by the present. This arises from the existence of hidden degrees of freedom – internal system variables not directly observed or accounted for in standard modeling. These unobserved variables retain information about the system’s history, effectively creating a memory effect. Consequently, traditional analytical methods, which rely on the Markovian approximation and treat the system as memoryless, are insufficient for accurately describing or predicting the dynamics of such systems. The information stored within these hidden degrees of freedom manifests as correlations between current and past states, leading to deviations from purely probabilistic predictions based on present conditions.

A two-measurement protocol enables access to information encoded in a system’s hidden degrees of freedom by utilizing time-resolved measurements. This technique involves an initial measurement that conditions the system’s state, followed by a second measurement performed after a defined time interval during which the system evolves freely. The correlation between these two measurements, quantified through techniques like mutual information or conditional probabilities, reveals the extent to which the system’s past states influence its present behavior. Specifically, deviations from the predictions of Markovian dynamics, where the present state fully determines the future, indicate the presence of retained historical information accessible via this protocol. The precision of quantifying this information is directly linked to the temporal resolution of the measurements and the accuracy with which the system’s evolution can be modeled between measurements.

Precisely tailored measurement protocols allow for the extraction of historical information encoded within a system’s hidden degrees of freedom. This is achieved by implementing time-resolved measurements that correlate current system states with past trajectories, effectively reconstructing the system’s evolution. The design of these measurements focuses on maximizing sensitivity to the relevant hidden variables, enabling quantification of non-Markovian effects and providing a more complete characterization of the system’s dynamics than is possible with standard, instantaneous measurements. By accessing this historical data, models can be refined to accurately predict future behavior, accounting for the influence of past states that would otherwise be neglected.

A two-measurement protocol utilizing particle position relative to a threshold <span class="katex-eq" data-katex-display="false">x_{th}</span> conditions the particle's state, sequentially refining the probability distributions <span class="katex-eq" data-katex-display="false">\rho_{|ij\rangle}(x,t_{j})</span> from the initial equilibrium distribution in the potential <span class="katex-eq" data-katex-display="false">V(x)</span> towards one of four distinct states: <span class="katex-eq" data-katex-display="false">|00\rangle</span>, <span class="katex-eq" data-katex-display="false">|01\rangle</span>, <span class="katex-eq" data-katex-display="false">|10\rangle</span>, and <span class="katex-eq" data-katex-display="false">|11\rangle</span>.
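The conditioning step can be sketched numerically. The snippet below is a toy stand-in for the experiment (an overdamped Ornstein-Uhlenbeck particle; the threshold, delay, and ensemble size are arbitrary choices): each trajectory is labelled by whether the particle sits below (0) or above (1) the threshold at the two measurement times, partitioning the ensemble into the four conditioned sub-ensembles.

```python
import numpy as np

# Toy sketch of the two-measurement conditioning (illustrative model, not the
# experimental setup): a trajectory is labelled |ij> by the particle's position
# relative to the threshold x_th at the two measurement times.

rng = np.random.default_rng(2)
n, dt, x_th = 20_000, 1e-2, 0.0
delta_t = 1.0                          # delay between the two measurements

x = rng.standard_normal(n)             # equilibrium ensemble at t_1
i = (x > x_th).astype(int)             # first measurement outcome
for _ in range(int(delta_t / dt)):     # free evolution over delta_t
    x += -x * dt + np.sqrt(2 * dt) * rng.standard_normal(n)
j = (x > x_th).astype(int)             # second measurement outcome

# The joint outcome (i, j) partitions the ensemble into the four conditioned
# sub-ensembles rho_{|ij>}; persistence of i in j reflects memory of t_1.
counts = {(a, b): int(np.sum((i == a) & (j == b))) for a in (0, 1) for b in (0, 1)}
print(counts)
```

For delays shorter than the relaxation time, the "same-outcome" states <span class="katex-eq" data-katex-display="false">|00\rangle</span> and <span class="katex-eq" data-katex-display="false">|11\rangle</span> dominate, which is exactly the correlation the protocol exploits.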

Bridging Information and Work: The Principles of Information Thermodynamics

Information thermodynamics establishes a theoretical link between changes in information and thermodynamic quantities like work and entropy. This framework departs from traditional thermodynamics by treating information as a physical resource; the acquisition of information about a system necessarily alters its entropy and, consequently, the potential for performing work. Specifically, it posits that work extraction is fundamentally constrained by the amount of information gained during a process, with the inequality <span class="katex-eq" data-katex-display="false">\Delta W \le k_B T \Delta I</span>, where <span class="katex-eq" data-katex-display="false">\Delta W</span> is the extracted work, <span class="katex-eq" data-katex-display="false">k_B</span> is Boltzmann’s constant, <span class="katex-eq" data-katex-display="false">T</span> is temperature, and <span class="katex-eq" data-katex-display="false">\Delta I</span> is the information gained, defining an upper bound. This approach allows for the analysis of systems where work can be extracted even in the absence of temperature gradients, provided sufficient information about the system’s state is obtained, and provides a means for quantifying the energetic cost of erasing information.
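A textbook numeric check of this bound, not the paper's protocol, is the Szilard-type binary measurement: the mutual information gained by a noisy two-outcome measurement caps the extractable work. The function below is an illustrative sketch in units of <span class="katex-eq" data-katex-display="false">k_B T</span>, with information in nats.

```python
import numpy as np

# Worked check of Delta W <= k_B T * Delta I for a noisy binary
# (Szilard-type) measurement -- a textbook illustration, not the paper's
# protocol. Work in units of k_B T, information in nats.

def binary_info(eps):
    """Mutual information of a symmetric binary measurement with error rate eps."""
    def h(q):  # binary entropy in nats
        return 0.0 if q in (0.0, 1.0) else -q * np.log(q) - (1 - q) * np.log(1 - q)
    return np.log(2) - h(eps)

for eps in (0.0, 0.1, 0.5):
    # The information gained upper-bounds the extractable work: W_max = Delta I.
    print(f"error rate {eps}: W_max <= {binary_info(eps):.3f} k_B T")
```

An error-free measurement yields <span class="katex-eq" data-katex-display="false">\ln 2</span> nats (the classic <span class="katex-eq" data-katex-display="false">k_B T \ln 2</span> of the Szilard engine), while a completely random one yields zero information and hence permits no work extraction.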

The Sagawa-Ueda bound, a central tenet of information thermodynamics, quantifies the maximum work that can be extracted from a system undergoing a stochastic process given knowledge of the system’s state. Specifically, it states that the expected work extractable, <span class="katex-eq" data-katex-display="false">W</span>, is limited by the Kullback-Leibler divergence <span class="katex-eq" data-katex-display="false">D_{KL}[p(x)||q(x)]</span> between the actual probability distribution of the system, <span class="katex-eq" data-katex-display="false">p(x)</span>, and the probability distribution resulting from a reversible process, <span class="katex-eq" data-katex-display="false">q(x)</span>. This bound directly connects work extraction to information gain; the more information obtained about the system through measurement, the closer the actual process can be driven to reversibility, and thus the greater the potential for work extraction. Importantly, the Sagawa-Ueda bound applies to systems not necessarily in thermal equilibrium, providing a generalized framework for analyzing the thermodynamics of information processing.

Experimental results demonstrate work extraction from the system surpassed the initial potential energy, quantified as exceeding the value <span class="katex-eq" data-katex-display="false">\bar{V}_{m}</span>. This observation directly confirms the occurrence of non-Markovian work extraction. Traditional Markovian processes limit work extraction to be less than or equal to the potential energy difference; exceeding this limit necessitates a non-Markovian dynamic where past system states influence current work extraction possibilities. The magnitude of work extracted, exceeding <span class="katex-eq" data-katex-display="false">\bar{V}_{m}</span>, provides quantitative evidence of this departure from Markovian behavior and validates the theoretical framework of information thermodynamics in characterizing this process.

Bootstrapping techniques are essential for robust statistical analysis in information thermodynamics due to the non-equilibrium nature of the measurements and the resulting complex data distributions. These techniques involve resampling with replacement from the original dataset to generate numerous synthetic datasets, allowing for the empirical estimation of the sampling distribution of a statistic – in this case, the work extracted or entropy production. By calculating the statistic for each resampled dataset, a confidence interval can be constructed without relying on assumptions about the underlying distribution, which may not hold in non-equilibrium systems. This approach is particularly crucial when dealing with limited data or complex statistical dependencies, providing a more accurate assessment of uncertainty than traditional analytical methods and ensuring the reliability of the observed results, including confirmation of non-Markovian work extraction.
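The resampling procedure itself is standard. The sketch below shows a bootstrap confidence interval for a mean on synthetic per-trajectory work values; the data, its distribution, and all parameters are stand-ins for illustration, not the paper's measurements.

```python
import numpy as np

# Sketch of a bootstrap confidence interval for the mean extracted work.
# The data below are synthetic stand-ins (the real analysis resamples
# measured trajectories); values are in units of k_B T.

rng = np.random.default_rng(3)
work = rng.exponential(scale=0.7, size=500)   # hypothetical per-trajectory work

n_boot = 5_000
# Resample with replacement: each row of idx selects one synthetic dataset.
idx = rng.integers(0, len(work), size=(n_boot, len(work)))
boot_means = work[idx].mean(axis=1)

# Percentile confidence interval, with no assumption about the underlying
# distribution of the work values.
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean work = {work.mean():.3f}, 95% CI = [{lo:.3f}, {hi:.3f}]")
```

Because the interval is read off the empirical distribution of resampled means, it remains valid for the skewed, non-Gaussian work distributions typical of non-equilibrium measurements.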

The relaxation of averaged particle potential energy is demonstrably modified by prior measurement states, with the influence of initial measurements diminishing as the time delay <span class="katex-eq" data-katex-display="false"> \delta t </span> increases beyond the relaxation time <span class="katex-eq" data-katex-display="false"> \tau_r </span>, ultimately converging to a single-measurement relaxation towards equilibrium energy <span class="katex-eq" data-katex-display="false"> 0.5 k_B T </span>.

Beyond the Second Law: The Power of Information Backflow

The study reveals that a system’s path through thermodynamic processes isn’t solely dictated by energy exchange, but is actively reshaped by extracting information from previously overlooked internal states – termed hidden degrees of freedom. This challenges the conventional view of thermodynamics as a closed system governed purely by energy conservation; instead, it suggests a more nuanced interplay where accessing and utilizing information about a system’s internal structure can demonstrably alter its behavior. Researchers observed that by ‘reading’ these hidden states – subtle configurations within a viscoelastic fluid – they could influence the system’s progression, effectively guiding it along a different thermodynamic trajectory and unlocking pathways to increased efficiency. This indicates that thermodynamic systems possess a latent computational capacity, and that manipulating this internal ‘memory’ offers a novel means of controlling their function and potentially exceeding the limitations traditionally imposed by the second law.

Conventional thermodynamics posits a one-way flow of energy, inevitably leading to entropy increase; however, recent investigations reveal a surprising phenomenon: information backflow. This process describes the transfer of information from the surrounding environment to a system, effectively circumventing the typical limitations imposed by the second law. Instead of solely relying on internal energy changes, a system can be influenced by external data, subtly altering its trajectory and potentially enhancing its ability to perform work. This isn’t a violation of fundamental laws, but rather a demonstration of how information, as a physical quantity, can reshape thermodynamic behavior. By accessing and utilizing information about its surroundings, a system can, in effect, ‘learn’ and adapt, leading to outcomes that deviate from predictions based on energy alone, and opening possibilities for novel engine designs and information processing paradigms.

Analysis revealed that peak work extraction efficiency occurred after a delay of approximately one second, a finding that suggests the presence of temporal correlations within the system. This delay isn’t simply a lag; it points to a ‘memory effect’ where the system effectively retains information about its recent past and utilizes it to optimize performance. The timescale of this memory, roughly one second, indicates how long the system ‘remembers’ relevant conditions before influencing subsequent work output. This observation moves beyond the conventional view of thermodynamic systems as being solely defined by their present state, demonstrating that past experiences can demonstrably shape their future ability to perform work, opening avenues for designing systems that learn and adapt over time.

The study of viscoelastic fluids revealed a characteristic structural relaxation time of approximately 6 seconds, a finding central to understanding the timescale of the system’s hidden degrees of freedom. This relaxation time represents how long it takes for the fluid’s internal structure to return to equilibrium after a disturbance, effectively dictating the speed at which the system can ‘remember’ and respond to changes. Researchers determined this value by observing the fluid’s response to applied stress, noting the delay before it fully relaxed to its original state; this delay indicates the time required for energy to dissipate across the complex network of molecular interactions within the fluid. This timescale is not merely a material property, but a fundamental constraint on the system’s ability to extract work, as information regarding past states is retained – and utilized – for this duration, challenging conventional thermodynamic assumptions about systems with negligible memory.

The ability to manipulate thermodynamic trajectories through information extraction opens exciting avenues for technological innovation, particularly in the realm of nanoscale engineering. Current limitations in miniaturization often stem from the increased influence of entropy and fluctuations at small scales; however, a system capable of leveraging information to counteract these effects promises dramatically improved efficiency. Researchers envision novel engines operating at the nanoscale, where the extraction of work is maximized not simply through physical processes, but through intelligently accessing and utilizing hidden degrees of freedom. Furthermore, the principles of information backflow suggest potential advancements in information processing itself, offering pathways to build systems that consume less energy while performing complex computations by strategically managing the flow of information between the system and its environment. This paradigm shift could ultimately lead to more sustainable and powerful technologies, bridging the gap between thermodynamics and information theory.

The ability to manipulate thermodynamic systems through information offers a pathway towards utilizing microscopic fluctuations as a genuine source of power. Traditional thermodynamics views these fluctuations as inevitable losses, disturbances to be minimized; however, research demonstrates that extracting information about these hidden degrees of freedom, and feeding that information back into the system, can steer it towards more efficient states. This isn’t merely about minimizing entropy, but actively shaping the system’s trajectory by leveraging the inherent randomness at the microscopic level. The implications extend to the design of novel nanoscale engines, where harnessing these fluctuations could circumvent limitations imposed by classical thermodynamics, and potentially revolutionize information processing by creating systems that operate at the very edge of physical possibility, converting randomness into usable work.

The study reveals a system exceeding the limitations of conventional Markovian processes by exploiting environmental memory within a viscoelastic fluid. This pursuit of work extraction from previously unconsidered degrees of freedom echoes a fundamental tenet of understanding: reducing complexity to reveal inherent power. As Thomas Hobbes observed, “The necessity of motion is a necessity of searching.” This research embodies that search, demonstrating how acknowledging the subtle, often hidden, mechanisms – like the fluid’s memory – allows for exceeding expected energetic limits. The elegance lies in removing the assumption of a memoryless system, rather than adding further complexity.

Further Refinements

The demonstrated extraction of work from non-Markovian dynamics is not, of course, a perpetual motion. It is merely an acknowledgement of information’s inherent cost. The fluid’s memory, initially a complication, becomes a resource only when considered alongside the engine’s operational parameters. Future work must address the limits of this resource; specifically, the trade-off between accessible memory depth and the energetic cost of its interrogation. A complete accounting remains elusive.

Current fluctuation theorems, while elegant, presuppose a timescale separation that viscoelastic fluids readily violate. A more generalized framework, capable of consistently treating slow and fast dynamics, is required. Such a framework should not seek greater complexity, but rather a more honest representation of the system’s fundamental constraints. The pursuit of efficiency often begins with an acceptance of limitation.

The implications extend beyond fluid mechanics. Any system exhibiting environmental memory – be it biological, chemical, or computational – may offer similar opportunities for work extraction. The question is not whether such extraction is possible, but whether it is worth the accounting. Simplicity, after all, often proves more valuable than ingenuity.


Original article: https://arxiv.org/pdf/2603.06160.pdf

Contact the author: https://www.linkedin.com/in/avetisyan/


2026-03-10 05:23