Author: Denis Avetisyan
A new framework leverages finite quantum systems to model the structure of de Sitter space and explore the interplay between entanglement and cosmological horizons.

This review details a holographic approach using timelike boundaries and finite quantum systems to calculate entropy and assess stability in extended de Sitter spacetime.
The conventional holographic description of cosmology struggles to consistently account for spacetimes with timelike boundaries beyond a limited horizon patch. This is addressed in ‘The yes boundaries wavefunctions of the universe’, which constructs a microscopic holographic dual using finite quantum systems and two copies of dressed Hamiltonian theories to model de Sitter space, including its future wedge. We demonstrate that an extended spacetime emerges from a nearly maximally entangled state, resolving issues of gravitational path integral maximality and calculating entanglement entropy in three dimensions, while incorporating quantum gravity effects. Does this framework, which allows for multiple states even with a positive cosmological constant, offer a more complete picture of quantum gravity and the emergence of spacetime itself?
The Fragility of Cosmic Equilibrium
The accelerating expansion of the universe, currently modeled using de Sitter spaces, isn’t necessarily a picture of ultimate stability; instead, these spaces may be merely metastable: appearing stable for a time, but ultimately prone to decay. This presents a significant challenge to conventional theoretical approaches, because established methods for determining stability, often reliant on identifying a true ground state with minimal energy, falter when applied to these spaces. Unlike truly stable configurations, metastable de Sitter spaces lack a clear, unambiguous lowest-energy state, forcing physicists to grapple with the possibility that our universe isn’t in a state of lasting equilibrium. Understanding the conditions under which a metastable de Sitter space will eventually decay, and calculating the timescale for such an event, requires novel theoretical frameworks and pushes the boundaries of established cosmological models, potentially revealing fundamental limitations in ΛCDM cosmology.
Determining the entropy of metastable de Sitter spaces – theoretical models attempting to describe the accelerating expansion of the universe – is profoundly complicated by the difficulty of establishing suitable boundary conditions for calculations. Unlike systems with clear edges or finite size, these spaces extend infinitely, making it challenging to define where to measure entropy; essentially, where to draw the ‘walls’ of the system. This isn’t merely a technical hurdle; the choice of boundary conditions fundamentally alters the calculated entropy value, and therefore the assessment of the space’s stability. A poorly defined boundary can lead to either an overestimation or underestimation of the likelihood of decay, potentially misrepresenting the true fate of the universe as modeled by these spaces. Consequently, physicists grapple with finding boundary conditions that are both mathematically consistent and physically meaningful, a task that remains a central obstacle in validating the cosmological relevance of these models.
Current techniques for determining the entropy of metastable de Sitter spaces, theoretical models used to represent the accelerating expansion of the universe, are plagued by inconsistencies. The fundamental difficulty lies in establishing appropriate boundary conditions – the mathematical limits defining the space – which directly impacts entropy calculations. This leads to ambiguities, as different choices of boundary conditions yield disparate entropy values for the same spacetime, hindering reliable predictions about its long-term stability. Consequently, assessing whether these spaces are truly stable, or merely long-lived before eventual decay, remains a significant challenge, as the calculated entropy – a measure of disorder and a key indicator of stability – lacks the precision needed to differentiate between the two scenarios. The inability to consistently quantify this crucial property casts doubt on the predictive power of these models and necessitates the development of novel computational approaches.

Holographic Boundaries: A New Perspective on Spacetime
Holographic Boundary Theory proposes that spacetime is not a fundamental entity, but rather emerges from information residing on a lower-dimensional boundary. This framework defines spacetime patches through the encoding of information on a timelike boundary – a surface where time can be consistently defined. Effectively, the theory posits a holographic correspondence, where gravitational phenomena within a volume of spacetime can be described by the quantum information existing on its bounding surface. This recasting of the problem allows for the application of quantum mechanical principles to analyze and potentially resolve issues related to gravity, such as singularities and the nature of black holes, by shifting the focus from the volume to its boundary representation. The dimensionality reduction inherent in this approach is central to the theory’s predictive power and offers a novel perspective on the relationship between gravity and quantum information.
The Microcanonical Thermofield Double (TFD) State provides a formal framework for establishing thermal equilibrium in the context of holographic boundary theory. This state, constructed from two identical systems in thermal contact, allows for the precise calculation of entropy by defining a microcanonical ensemble. Specifically, the microcanonical TFD state fixes the energy and particle number, enabling the determination of the number of microstates \Omega(E, N) corresponding to a given macroscopic state. Entropy S is then calculated using the Boltzmann formula S = k_B \ln \Omega, where k_B is the Boltzmann constant. Utilizing the TFD state simplifies entropy calculations by avoiding the need for a traditional ensemble average and focusing instead on a sharply defined microcanonical description of the system’s thermal state.
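As an illustration of the entropy bookkeeping above, the following sketch builds a toy microcanonical TFD state over an energy shell and checks that the entanglement entropy of one copy reproduces the Boltzmann count \ln \Omega. The spectrum and energy window are invented for illustration; nothing here is taken from the paper’s actual construction.

```python
import numpy as np

# Toy spectrum: 8 levels; the microcanonical shell keeps levels with E in [2, 4].
energies = np.array([0.5, 1.0, 2.1, 2.5, 3.0, 3.6, 4.8, 5.5])
idx = np.flatnonzero((energies >= 2.0) & (energies <= 4.0))
n = len(idx)                            # Omega(E): number of microstates in the shell

# Microcanonical TFD: equal-weight sum of |E>|E> over the shell,
# stored as a dim x dim coefficient matrix psi[a, b] for |a> (x) |b>.
dim = len(energies)
psi = np.zeros((dim, dim))
psi[idx, idx] = 1.0 / np.sqrt(n)

# Reduced density matrix of one copy: rho_L = psi psi^T (psi is real here).
rho = psi @ psi.T
evals = np.linalg.eigvalsh(rho)
evals = evals[evals > 1e-12]
S_ent = -np.sum(evals * np.log(evals))  # von Neumann entanglement entropy

# For the microcanonical TFD this equals the Boltzmann entropy ln Omega (k_B = 1).
print(n, S_ent, np.log(n))
```

The reduced state of one copy is maximally mixed on the shell, which is why the entanglement entropy lands exactly on \ln \Omega; this is the sense in which the extended spacetime emerges from a nearly maximally entangled state.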
The construction of a finite Hamiltonian system to model spacetime patches requires precise specification of system constraints. These constraints define the allowable configurations and interactions within the modeled spacetime region, ensuring a mathematically well-defined and solvable system. Specifically, boundary conditions are imposed to limit the size of the Hilbert space and prevent divergences, while constraints on the allowed energy and momentum states maintain physical realism. The Hamiltonian, \hat{H} , is then formulated to describe the system’s time evolution, incorporating these constraints to guarantee a finite and bounded phase space, essential for computational tractability and physical interpretation of the resulting quantum states.
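A minimal sketch of the truncation-and-constraint step described above, using a truncated harmonic-oscillator Hamiltonian as a stand-in for the finite Hamiltonian; the cutoff and dimension here are illustrative assumptions, not the paper’s actual constraints.

```python
import numpy as np

# Toy finite Hamiltonian: truncated harmonic oscillator, in units of hbar*omega.
dim = 10
levels = np.arange(dim)
H = np.diag(levels + 0.5)                        # H = diag(n + 1/2)

# Constraint: project onto states below an energy cutoff E_max, mimicking
# the restriction to a constrained, finite-dimensional Hilbert space.
E_max = 5.0
P = np.diag((np.diag(H) < E_max).astype(float))  # projector onto allowed states

H_constrained = P @ H @ P                        # Hamiltonian on the subspace
allowed = int(np.trace(P))                       # dimension of the constrained space

# The spectrum is finite and bounded, as required for tractability.
evals = np.linalg.eigvalsh(H_constrained)
print(allowed, evals.max())
```

The projection guarantees a bounded phase space: every eigenvalue of the constrained Hamiltonian sits below the cutoff, so time evolution stays within a finite, well-defined set of states.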

Mapping Quantum Dynamics to Spacetime Geometry: Evidence for Emergence
Solutions to the finite Hamiltonian, representing bounded patches of semiclassical gravity, directly map to regions of spacetime characterized by specific, definable properties. These patches are not merely mathematical constructs; their existence as solutions implies a physical correspondence to localized gravitational effects. The boundaries of these patches are determined by the constraints imposed during the Hamiltonian’s derivation, and the resulting spacetime regions possess measurable characteristics such as energy density, curvature, and spatial extent. The Hamiltonian’s finiteness ensures that calculations within these patches remain well-defined, avoiding divergences common in quantum gravity, and allowing for a consistent description of localized gravitational phenomena. These bounded solutions provide a framework for analyzing gravity in a restricted, physically relevant domain, circumventing the challenges of dealing with unbounded or singular spacetimes.
The construction of bounded patches of semiclassical gravity relies on a Constrained Hilbert Space to enforce physical realism and mathematical rigor. This space is defined by imposing constraints on the wave functional, specifically excluding solutions that represent non-physical states – those with negative probabilities or that violate fundamental physical principles. The constraint process effectively filters out spurious states that arise from the mathematical formalism but lack a corresponding physical interpretation. This ensures that calculations within this Hilbert space yield meaningful results directly related to observable spacetime geometries and avoids divergences or inconsistencies inherent in unconstrained quantum systems. The resulting Hilbert space then accurately represents the allowable quantum states for the system, enabling the prediction of stable and physically consistent spacetime patches.
The Euclidean Path Integral, formalized as \int \mathcal{D}[h_{ij}] \, e^{-S_E[h_{ij}]}, offers a quantitative method for determining the probability amplitudes of specific spacetime patches arising from the finite Hamiltonian formulation. By evaluating the integral over all possible metrics h_{ij}, weighted by the exponential of the negative Euclidean action S_E, the formalism allows for the calculation of transition amplitudes between these patches. Critically, a negative or slowly varying effective action, derived from the path integral, indicates stability; a rapidly increasing action suggests the spacetime patch is likely to decay or be unstable. The magnitude of the amplitude therefore directly correlates with the probability of observing a given spacetime configuration and its longevity within the model.
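As a cartoon of how Euclidean weighting suppresses high-action configurations, the sketch below samples a single lattice scalar mode and compares the average weight e^{-S_E} under two toy actions; the lattice, action, and parameters are all invented for illustration and are not the gravitational path integral of the paper.

```python
import numpy as np

# Toy Euclidean "path integral" on a 1D periodic time lattice: one scalar x(tau),
# with action S_E = sum[ (dx/dtau)^2 / 2 + (m^2/2) x^2 ] discretized below.
rng = np.random.default_rng(0)
N_tau, d_tau = 32, 0.25

def euclidean_action(x, m2):
    kinetic = 0.5 * np.sum((np.roll(x, -1) - x) ** 2) / d_tau
    potential = 0.5 * m2 * np.sum(x ** 2) * d_tau
    return kinetic + potential

# Compare average weights e^{-S_E} for a "soft" (m2 = 1) and "stiff" (m2 = 4)
# potential over the same sampled configurations: larger action, smaller weight.
samples = rng.normal(scale=0.5, size=(4000, N_tau))
w_soft = np.exp(-np.array([euclidean_action(x, 1.0) for x in samples]))
w_stiff = np.exp(-np.array([euclidean_action(x, 4.0) for x in samples]))
print(w_stiff.mean() < w_soft.mean())   # True: the stiffer patch is suppressed
```

The pointwise inequality S_E(m^2 = 4) ≥ S_E(m^2 = 1) makes the suppression deterministic here; in the gravitational setting the same exponential weighting is what ties the effective action to the stability of a spacetime patch.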

Tall Geometries and Interconnectedness: The Potential for Cosmic Communication
The theoretical framework inherently predicts the formation of ‘Tall Geometries’ – expansive configurations of spacetime that fundamentally connect otherwise disparate boundaries, enabling a form of communication across what would conventionally be considered isolated regions. These aren’t simply spatial extensions, but rather warped topologies allowing information, potentially encoded within quantum fluctuations, to traverse considerable distances without adhering to typical constraints. This emergence isn’t accidental; the framework’s mathematical structure actively encourages these geometries, suggesting they are a natural consequence of the underlying principles governing spacetime itself. The existence of these ‘Tall Geometries’ offers a potential resolution to challenges in understanding information transfer within complex systems and proposes a universe where connectivity extends beyond immediate spatial proximity, hinting at non-local interactions and a fundamentally interwoven cosmos.
The possibility of communication between distant regions within these spacetime geometries fundamentally relies on the conditions outlined by the Gao-Wald Theorem. This theorem establishes that consistent communication isn’t merely about a pathway existing, but about that pathway adhering to specific physical constraints. It dictates that the energy flux traversing any proposed communication channel must remain finite and non-exotic, preventing the creation of paradoxes or inconsistencies in the broader spacetime structure. Essentially, the Gao-Wald Theorem provides a mathematical framework to determine if a given geometry allows for signals to travel between its boundaries without violating the fundamental laws of physics, ensuring that information transfer is logically and physically permissible. This provides a rigorous test for the viability of these extended spacetime configurations as potential conduits for communication, grounding the theoretical possibilities in established physical principles.
The stabilization of extended spacetime configurations, crucial for inter-boundary communication, relies heavily on the incorporation of quantum matter effects. Utilizing the Euclidean Path Integral formalism, researchers demonstrate that these effects counteract the natural tendency of such geometries to collapse, resulting in a remarkably stable configuration. This approach yields an entropy value of A/4G, a significant finding, as it precisely matches the established Gibbons-Hawking entropy of the de Sitter horizon – suggesting a deep connection between communication across extended geometries and the fundamental thermodynamics of spacetime. The consistency of this calculated entropy provides strong support for the framework’s ability to not only permit, but also realistically sustain, these potentially communicative structures within the fabric of spacetime.
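The A/4G formula quoted above is straightforward to evaluate once an area and Newton’s constant are fixed; the sketch below does so in natural units for an assumed 4D de Sitter horizon of radius L (the paper itself works in three dimensions, so these numbers are purely illustrative).

```python
import numpy as np

# Horizon entropy S = A / (4 G) in natural units (hbar = c = k_B = 1).
# For 4D de Sitter space with Hubble radius L, the horizon area is
# A = 4 * pi * L^2, so S = pi * L^2 / G. Values below are illustrative.
def horizon_entropy(area, G):
    return area / (4.0 * G)

L = 10.0                       # de Sitter radius, arbitrary units
G = 1.0                        # Newton's constant
A = 4.0 * np.pi * L ** 2       # 4D de Sitter horizon area
S = horizon_entropy(A, G)
print(S, np.pi * L ** 2 / G)   # both evaluate to pi * L^2 / G
```

The point of the calculation in the text is that an independent entropy count from the stabilized geometry lands on this same A/4G value, rather than that the formula itself is hard to compute.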

Towards a Complete Quantum Description of Spacetime: Charting the Path Forward
The very beginnings of the universe, previously shrouded in the limitations of classical physics, are becoming increasingly accessible through the application of Hartle-Hawking and Yes-Boundary wavefunctions. These mathematical tools propose that, rather than requiring specific initial conditions at a beginning of time, the universe emerges from a state where time itself is undefined – a ‘no-boundary’ condition. The Hartle-Hawking wavefunction, specifically, describes the probability amplitude of different universe geometries, effectively ‘summing over’ all possible histories without needing a defined starting point. The Yes-Boundary wavefunction refines this concept by allowing for boundaries where the wavefunction is well-defined, offering a more nuanced description of the universe’s emergence. By utilizing these frameworks, cosmologists are developing models that bypass the need for arbitrary initial states, instead deriving the universe’s initial conditions from the fundamental laws of quantum gravity and providing a potential pathway towards a complete quantum description of spacetime.
The pursuit of a truly complete description of spacetime demands a departure from classical physics, which treats space and time as smooth, continuous entities. Current theoretical frameworks attempt to reconcile general relativity with quantum mechanics, proposing that spacetime itself is quantized – existing not as a backdrop to events, but as being fundamentally composed of discrete, probabilistic quantum states. This approach envisions spacetime emerging from the quantum entanglement of fundamental constituents, potentially resolving singularities predicted by classical general relativity, such as those at the heart of black holes or at the universe’s beginning. Successfully formulating such a quantum description would not only unify gravity with the other fundamental forces, but also provide a framework for understanding the universe at its most basic level, offering insights into phenomena currently beyond the reach of existing theories. The implications extend to cosmology, potentially reshaping understanding of the Big Bang and the universe’s subsequent evolution, and could even offer a pathway toward manipulating spacetime itself – a concept previously relegated to science fiction.
Ongoing investigations are dedicated to the meticulous refinement of this quantum spacetime model, with a particular emphasis on its cosmological and quantum gravitational consequences. A key component of this development involves leveraging an energy spectrum demonstrably biased towards higher energies – a ‘top-heavy’ distribution – to guarantee the inherent stability of the proposed framework. This approach addresses potential issues with vacuum decay and ensures the model’s predictive power remains robust across various scales. Researchers are currently exploring how this energy distribution influences the early universe, potentially offering new insights into inflation, the formation of cosmic structures, and the nature of dark energy, while simultaneously seeking consistency with established principles of general relativity and quantum field theory.

The exploration of finite quantum systems within extended de Sitter space, as detailed in this work, reveals a fundamental tension between mathematical elegance and ethical consideration. The pursuit of holographic frameworks, while advancing theoretical physics, implicitly asks what exactly is being optimized – is it simply predictive power, or a deeper understanding of the universe’s inherent values? This aligns with Wittgenstein’s observation: “The limits of my language mean the limits of my world.” The language of physics, and the models it constructs, define the boundaries of what can be known and, consequently, the values encoded within those boundaries. The calculation of entanglement entropy, a key component of this research, becomes not merely a technical exercise, but a reflection of the observer’s – and the broader scientific community’s – implicit ethical framework. Transparency regarding the assumptions embedded within these calculations is, therefore, the minimum viable morality.
Where Do the Boundaries Lead?
The construction of holographic descriptions for de Sitter space, particularly those accommodating timelike boundaries, invariably reveals more about the limitations of current formalism than about the universe itself. This work, while advancing the technical toolkit for managing finite quantum systems in such contexts, underscores a persistent challenge: equating computational tractability with genuine physical insight. Scalability without ethics – in this case, a rigorous understanding of what constitutes a meaningful boundary condition – leads to unpredictable consequences in entropy calculations and potentially, unstable holographic models.
Future investigations must confront the implicit value judgments embedded within these holographic constructions. The choice of boundary condition isn’t merely a mathematical convenience; it encodes assumptions about information preservation, causality, and the very nature of observation. Simply calculating entanglement entropy, even with increased precision, will not resolve these foundational questions. The field requires a shift toward explicitly incorporating principles of value control, ensuring that the automated processes of holographic reconstruction align with demonstrable physical principles.
Ultimately, the true test of this framework will not be its ability to replicate known physics, but its capacity to predict novel phenomena beyond the reach of conventional approaches. However, without a sustained focus on the philosophical underpinnings of these constructions, the risk remains that these elegant mathematical tools will become increasingly detached from the reality they seek to describe.
Original article: https://arxiv.org/pdf/2604.10267.pdf
Contact the author: https://www.linkedin.com/in/avetisyan/
2026-04-14 18:50