The Unraveling of Cause and Effect

Author: Denis Avetisyan


A new look at the foundations of physics suggests causality isn’t a universal law, but a consequence of the universe’s structure and its march toward disorder.

This review examines how our understanding of causality has evolved from classical mechanics to modern relativity and quantum mechanics, arguing for its emergent nature in macroscopic systems.

Despite its intuitive grounding in everyday experience, causality remains a surprisingly subtle concept within the foundations of physics. This paper, ‘Causality in Physics: From Galileo to Einstein, and Beyond’, traces the historical evolution of this principle, and its frequent recession, through successive physical frameworks, from classical mechanics to modern relativity and quantum theories. We argue that causality is less a fundamental law and more an emergent property, arising from the mathematical structure of spacetime, thermodynamic irreversibility, and statistical behaviors at macroscopic scales. But if causality isn’t built into the universe, what determines the perceived arrow of time and the predictability of physical systems?


The Fragile Foundation: Defining Cause in a Dynamic Universe

The notion of cause and effect, while fundamental to human understanding, presents a surprisingly complex challenge when subjected to the precision of physics. Establishing a definitive link between an event and its consequence demands more than simple observation; it requires a framework to differentiate genuine causal relationships from mere correlations. A true cause must precede its effect, and crucially, altering the cause must consistently result in a corresponding change in the effect – a principle easily stated, yet difficult to prove definitively, particularly when considering complex systems with numerous interacting variables. This pursuit of rigorous definition isn’t merely philosophical; it underpins the very foundations of predictive modeling and our ability to manipulate the physical world, forcing physicists to move beyond intuitive assumptions and develop quantifiable metrics for establishing causality.

Newtonian dynamics fundamentally reshaped the understanding of cause and effect by establishing a deterministic framework for motion. Prior to this, explaining why objects moved, or stopped moving, relied heavily on attributing agency or invoking abstract forces. Isaac Newton, building on the work of predecessors like Galileo, demonstrated that motion isn’t an inherent property but a response to external forces. This is encapsulated in his laws, notably the first law, the principle of inertia, which states that an object at rest stays at rest, and an object in motion stays in motion with the same speed and in the same direction, unless acted upon by a force. Newton’s second law, F = ma, makes the relationship quantitative: the net force on a body equals the product of its mass and its acceleration. Consequently, if one knows all the forces acting on an object, its future motion is, in principle, entirely predictable, solidifying a causal view in which effects invariably follow from preceding forces and initial conditions.
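
To make this determinism concrete, the sketch below steps a particle forward under F = ma. It is an illustration only, not drawn from the paper; the function name, the Euler integration scheme, and the spring-force example are assumptions made for the demonstration. The point is that once the force law and the initial conditions are fixed, the whole trajectory is fixed.

```python
import numpy as np

def integrate_motion(force, m, x0, v0, dt, steps):
    """Advance a particle using Newton's second law, a = F/m,
    with simple semi-implicit Euler steps: the future state follows
    entirely from the force law and the initial conditions."""
    x, v = x0, v0
    trajectory = [x]
    for _ in range(steps):
        a = force(x, v) / m   # F = ma, rearranged to a = F/m
        v = v + a * dt        # velocity responds to acceleration
        x = x + v * dt        # position responds to velocity
        trajectory.append(x)
    return np.array(trajectory)

# Example: a 1 kg mass on a spring (F = -kx). Re-running with the
# same x0 and v0 always reproduces the same path -- the classical
# picture of causality as determinism.
path = integrate_motion(lambda x, v: -4.0 * x, m=1.0,
                        x0=1.0, v0=0.0, dt=0.01, steps=1000)
```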

Galileo’s Two New Sciences didn’t explicitly define inertia as a principle, but its careful analyses of motion fundamentally established the conditions necessary for causal reasoning about forces and movement. Through thought experiments involving inclined planes and falling bodies, Galileo demonstrated that an object in motion remains in motion – barring external influences – rather than requiring a constant force to sustain it. This observation, implicitly challenging Aristotelian physics, shifted the focus from maintaining motion to explaining changes in motion. By isolating and quantifying the effects of gravity and resistance, Galileo revealed that forces are not inherent in objects to keep them moving, but rather are external agents responsible for altering their state of motion, thereby creating a framework where cause – the force – could be linked directly to effect – the change in velocity. This paved the way for Newton’s formalization of inertia and a deterministic view of causality, where predictable outcomes arise from defined forces acting upon bodies.

Spacetime’s Embrace: Redefining Causal Boundaries

Special Relativity, formalized in 1905, fundamentally altered the classical Newtonian concepts of space and time by positing that they are not absolute but are instead interwoven into a single four-dimensional continuum known as spacetime. This framework is based on two primary postulates: the laws of physics are invariant in all inertial frames of reference, and the speed of light in a vacuum, denoted c (exactly 299,792,458 meters per second under the modern SI definition), is constant for all observers, regardless of the motion of the light source. Consequently, measurements of both space and time are relative to the observer’s frame of reference, leading to phenomena such as time dilation and length contraction as an object’s velocity approaches c. The introduction of a universal speed limit, c, has profound implications for causality and the structure of the universe, as it dictates the maximum rate at which information and influence can propagate.
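
The quantitative content of these effects is carried by the Lorentz factor. The standard textbook formulas (not specific to this paper) for time dilation and length contraction read:

$$\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad \Delta t' = \gamma\,\Delta t, \qquad L' = \frac{L}{\gamma}$$

As v approaches c, γ grows without bound: moving clocks tick slower and moving rods contract, each by the same factor.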

The light cone is a geometrical representation defining the boundaries of causal influence for a specific event in spacetime. An event can influence only events within its future light cone, the region of spacetime reachable by signals traveling at or below the speed of light from the initial event. Conversely, the event can be influenced only by events within its past light cone, those events that could have sent signals to it at or below the speed of light. Events outside the light cone are considered causally disconnected, meaning no information or influence can be exchanged between them and the central event; this is a direct consequence of the universal speed limit imposed by special relativity. The light cone therefore provides a visual and mathematical framework for determining which events are causally related and which are not, fundamentally reshaping our understanding of cause and effect in the universe.
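
This classification can be stated algorithmically. The short sketch below is illustrative, not from the paper; the function name and example events are invented. It uses the sign of the Minkowski interval to decide whether two events could, in principle, be causally connected:

```python
C = 299_792_458.0  # speed of light in m/s (exact by definition)

def causal_relation(dt, dx):
    """Classify the separation of two events from their time
    difference dt (seconds) and spatial distance dx (meters),
    via the Minkowski interval s^2 = -(c*dt)^2 + dx^2."""
    s2 = -(C * dt) ** 2 + dx ** 2
    if s2 < 0:
        return "timelike: inside the light cone, causal influence possible"
    if s2 == 0:
        return "lightlike: on the light cone, connectable only at speed c"
    return "spacelike: outside the light cone, causally disconnected"

# Two events one nanosecond and one meter apart: light covers only
# ~0.3 m in 1 ns, so neither event can influence the other.
print(causal_relation(dt=1e-9, dx=1.0))
```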

General Relativity builds upon the spacetime framework of Special Relativity by describing gravity not as a force, but as a manifestation of the curvature of spacetime caused by mass and energy. This curvature directly impacts the causal structure of the universe; the presence of mass-energy alters the paths of light and massive particles, effectively changing which events can influence others. Consequently, the light cone at any given spacetime point is no longer necessarily aligned with the coordinate axes, and its shape is distorted by the gravitational field. This means that the boundaries defining past, present, and future become relative to the observer’s location within the curved spacetime, and phenomena like gravitational time dilation and gravitational lensing arise as direct consequences of this altered causal structure. The relationship between geometry and matter is summarized by the Einstein field equations:

$$R_{\mu\nu} - \frac{1}{2}g_{\mu\nu}R + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}T_{\mu\nu}$$

where $R_{\mu\nu}$ and $R$ are the Ricci curvature tensor and scalar derived from the metric $g_{\mu\nu}$, $\Lambda$ is the cosmological constant, and $T_{\mu\nu}$ is the stress-energy tensor describing the distribution of matter and energy.

The Quantum Riddle: Probability and the Erosion of Determinism

Classical physics operates on the principle of determinism, where a system’s future state is fully determined by its initial conditions and the forces acting upon it. However, quantum mechanics fundamentally diverges from this model by introducing inherent probabilistic behavior. Unlike classical mechanics which predicts definite outcomes, quantum mechanics provides only the probability of observing a particular outcome. This is not due to a lack of information, but rather an intrinsic property of quantum systems; even with complete knowledge of a system’s initial state, only probabilistic predictions are possible. For example, the position of an electron is not precisely defined but described by a probability distribution, governed by the wave function Ψ. This probabilistic nature is experimentally verified through observations like the double-slit experiment, demonstrating that particles do not follow definite trajectories but exhibit wave-like behavior with probabilities dictating their detection.
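
The rule connecting the wave function to observed statistics is the Born rule. In its standard form (a textbook statement, not specific to this paper), the probability density is the squared magnitude of Ψ, and superposition produces the interference the double-slit experiment reveals:

$$P(x) = |\Psi(x)|^2, \qquad \Psi = \Psi_1 + \Psi_2 \;\Rightarrow\; |\Psi|^2 = |\Psi_1|^2 + |\Psi_2|^2 + 2\,\mathrm{Re}\!\left(\Psi_1^{*}\Psi_2\right)$$

The cross term is the interference pattern; no classical mixture of definite trajectories through one slit or the other can reproduce it.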

The Path Integral Formulation, developed by Richard Feynman, calculates the probability of a quantum event by considering all possible paths a particle could take between two points in spacetime. Unlike classical mechanics which defines a single, deterministic trajectory, and even early quantum mechanics relying on wave functions, this approach assigns an amplitude to each path, weighted by the exponential of iS/ħ, where S is the action and ħ is the reduced Planck constant. The overall probability is then obtained by summing (integrating) these amplitudes over all paths. Crucially, this summation inherently eliminates the need for a predefined “goal” or teleology; the particle doesn’t choose a single path based on minimizing time or energy, but effectively explores all possibilities simultaneously, with the resulting probability distribution dictating observed behavior. This provides a complete and mathematically rigorous framework for understanding quantum phenomena like interference and tunneling, offering a fundamentally different perspective on quantum evolution than wave function-based approaches.
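
Schematically, and in its standard textbook form rather than anything specific to this paper, the formulation assigns the transition amplitude by summing a phase factor over all paths, with the action computed along each:

$$K(b, a) = \int \mathcal{D}[x(t)]\; e^{\,iS[x(t)]/\hbar}, \qquad S[x(t)] = \int_{t_a}^{t_b} L(x, \dot{x}, t)\, dt, \qquad P(a \to b) = |K(b, a)|^2$$

Paths far from the classical trajectory contribute rapidly oscillating phases that largely cancel, which is why classical, apparently causal motion emerges in the limit where S is large compared with ħ.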

The applicability of classical causality at the quantum level is contested due to the probabilistic nature of quantum mechanics. In classical physics, a definite cause precedes a definite effect; however, quantum events are fundamentally described by probability distributions, meaning only the likelihood of an outcome can be predicted with certainty. This challenges the notion of strict deterministic causality, where a known initial state necessitates a specific future state. While macroscopic systems generally exhibit behavior consistent with classical causality due to statistical averaging of quantum events, individual quantum phenomena suggest that the concept of a singular, definable cause may not hold. Investigations into quantum entanglement and retrocausality further complicate the understanding of temporal relationships between cause and effect, prompting ongoing research into alternative frameworks for describing causality at the quantum scale.

The Thermodynamic Arrow: Causality as an Emergent Property

The Second Law of Thermodynamics dictates that the total entropy of an isolated system can only increase over time, establishing a directionality often referred to as the ‘arrow of time’. Entropy, a measure of disorder or the number of possible microstates corresponding to a given macrostate, fundamentally links thermodynamic processes to causality. While the underlying microscopic laws of physics are generally time-symmetric – meaning they function identically whether time is moving forward or backward – the Second Law introduces an asymmetry at the macroscopic level. This asymmetry manifests as irreversible processes; events proceed in a specific temporal order because the reverse process would require a decrease in entropy, a statistically improbable outcome for isolated systems. Consequently, cause precedes effect because the causal event represents a lower entropy state transitioning to a higher entropy state, aligning with the thermodynamic directionality imposed by the Second Law.
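
Boltzmann’s statistical definition makes the link explicit. In standard form (not specific to this paper), the entropy counts the microstates Ω compatible with a macrostate, and the Second Law asserts that this count does not decrease for an isolated system:

$$S = k_B \ln \Omega, \qquad \Delta S \geq 0 \ \ \text{(isolated system)}$$

A macrostate realized by more microstates is overwhelmingly more likely to be reached than one realized by fewer, which is all the ‘arrow’ amounts to at the statistical level.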

Statistical mechanics provides a framework for understanding macroscopic phenomena, such as causality, by relating them to the underlying microscopic behavior of constituent particles. Unlike deterministic, time-reversible microscopic laws – which govern individual particle interactions – macroscopic causality arises from considering the statistical behavior of vast ensembles of particles. This approach demonstrates that macroscopic properties are not inherent in individual particles but emerge from their collective interactions and the probabilities governing those interactions. Specifically, causality isn’t a fundamental law, but a consequence of observing systems with specific initial conditions and utilizing coarse-grained descriptions that average over microscopic details; the increase in entropy, a statistical tendency, is then perceived as a directionality – the ‘arrow of time’ – and thus a causal sequence.
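
A toy model makes this emergence tangible. The sketch below is an Ehrenfest-style urn model offered purely as an illustration (the function and its parameters are invented for the example): the microscopic update rule has no built-in preference for either urn, yet the coarse-grained entropy almost always climbs from a low-entropy start.

```python
import math
import random

def ehrenfest_entropy(n_balls=100, steps=500, seed=0):
    """Ehrenfest urn model: each step, one ball chosen uniformly at
    random hops to the other urn. The rule favors neither urn, yet
    the coarse-grained entropy log C(N, k) statistically increases
    from a low-entropy start -- an emergent arrow of time."""
    random.seed(seed)
    left = n_balls            # low-entropy start: every ball in one urn
    entropies = []
    for _ in range(steps):
        if random.random() < left / n_balls:
            left -= 1         # a left-urn ball was chosen; it hops right
        else:
            left += 1         # a right-urn ball was chosen; it hops left
        entropies.append(math.log(math.comb(n_balls, left)))
    return entropies

S = ehrenfest_entropy()
print(f"initial S = {S[0]:.2f}, final S = {S[-1]:.2f}")  # S climbs toward its maximum
```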

Causality, as observed in macroscopic systems, is not a foundational principle of physics but an emergent property. This emergence stems from a specific confluence of factors: the low-entropy initial conditions of the universe, the underlying time-symmetric nature of fundamental physical laws, and the application of coarse-grained descriptions to complex systems. While microscopic interactions may be reversible in time, macroscopic behavior exhibits a distinct directionality due to the statistical improbability of returning to initial low-entropy states. Coarse-graining, the process of averaging over microscopic details, further obscures reversibility and reinforces the perception of causal relationships, effectively defining a statistical arrow of time that governs observable phenomena.

Beyond Linearity: Reframing Causality as an Organizational Principle

The conventional understanding of causality – that every effect has a prior cause operating within a fixed temporal order – faces compelling challenges when examined through the lenses of relativity, quantum mechanics, and thermodynamics. Relativity demonstrates that simultaneity is not absolute, meaning the order of events, and thus causal relationships, can be observer-dependent. Quantum mechanics introduces inherent probabilistic behavior, suggesting that effects are not always uniquely determined by preceding causes, but rather exist as a range of possibilities. Furthermore, thermodynamics reveals that complex systems exhibit emergent behavior – properties arising from interactions that aren’t predictable from the individual components – implying that causality itself might not be a bedrock principle of reality, but a macroscopic property emerging from the statistical behavior of these complex systems. This perspective shifts the focus from identifying a singular, definitive cause to understanding the network of interactions that give rise to observed effects, suggesting causality is a useful description for complex phenomena rather than a fundamental principle of reality.

Lagrangian and Hamiltonian mechanics, despite their success in predicting physical phenomena, subtly undermine conventional understandings of cause and effect. These frameworks don’t dictate a linear progression from initial conditions to outcomes; instead, they identify paths that extremize a quantity called the action, a concept rooted in variational principles. This means the universe doesn’t ‘choose’ a causal chain step by step; rather, the realized trajectory is the one for which the action is stationary against all neighboring alternatives. Furthermore, Noether’s Theorem, a cornerstone of these mechanics, reveals a profound connection between symmetries in a system and conserved quantities, such as energy or momentum. This implies that causality isn’t simply about one event preceding another, but is deeply interwoven with the underlying symmetries of the universe; a change in symmetry fundamentally alters the possible causal relationships. Consequently, these mechanics suggest that causality isn’t a rigid, pre-ordained rule, but an emergent property arising from the interplay of symmetry and the extremization of the action, offering a nuanced and potentially non-deterministic view of how events unfold.
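
In standard notation (textbook results, not specific to this paper), the action is the time integral of the Lagrangian, and demanding that it be stationary yields the equations of motion:

$$S[q] = \int_{t_1}^{t_2} L(q, \dot{q}, t)\, dt, \qquad \delta S = 0 \;\Longrightarrow\; \frac{d}{dt}\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = 0$$

Noether’s Theorem then ties each continuous symmetry of L to a conserved quantity: invariance under time translation yields energy conservation, and invariance under spatial translation yields momentum conservation.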

The conventional view of causality, long considered a bedrock principle governing the universe, is undergoing a significant re-evaluation, as this paper details. Rather than an inherent, fundamental law dictating that every effect has a prior cause, causality appears to be an emergent property – a phenomenon arising from the complex interactions within systems. This conceptual evolution doesn’t negate the practical utility of cause-and-effect reasoning, but reframes it as a statistical tendency observed at macroscopic scales. Investigations into relativity, quantum mechanics, and thermodynamics suggest that the strict, linear causality assumed in classical physics breaks down at extreme conditions, hinting that causality is not a pre-existing condition of the universe but a consequence of its organization. The implications of this shift extend beyond theoretical physics, influencing perspectives on determinism, free will, and the very nature of reality itself, positioning causality not as a ‘given’ but as a derived characteristic of complex systems.

The exploration of causality, as detailed within the paper, reveals a shift from viewing it as an inherent law of the universe to recognizing its emergent nature. This echoes Galileo Galilei’s observation: “You cannot teach a man anything; you can only help him discover it himself.” The paper posits that causality isn’t dictated to the universe, but discovered within its mathematical and thermodynamic structures. Like understanding motion through observation, the authors demonstrate how causality arises from the interplay of spacetime and irreversibility, particularly at macroscopic scales. It is not a pre-ordained principle, but a consequence of the system’s evolution, mirroring the process of uncovering natural laws rather than imposing them.

The Horizon of Influence

The diminishing of causality as a foundational tenet, its relegation to an emergent property, does not invalidate the predictive power of physical laws. Rather, it reframes the question. Every failure is a signal from time, indicating a limit to the system’s ability to maintain order against the inevitable increase of entropy. The persistence of macroscopic causality, then, isn’t a given, but a local, temporary reprieve purchased by the second law. The true inquiry shifts toward understanding the precise mechanisms by which these macroscopic regularities arise from underlying, fundamentally acausal processes.

Future investigations must confront the uncomfortable implications of this perspective for fields like cosmology and quantum gravity. If causality is not inherent in the universe, but a consequence of specific conditions, what governed the very earliest moments? What role does spacetime itself play? Is it a stage upon which causality unfolds, or an emergent property alongside it? The exploration of non-causal models, however unsettling, becomes not merely an intellectual exercise, but a necessity.

Refactoring is a dialogue with the past; each revision of our understanding reveals the assumptions baked into previous formulations. The path forward demands a willingness to abandon cherished intuitions, and to accept that the universe may operate according to principles far removed from those readily accessible to human perception. The pursuit is not to find causality, but to map the boundaries of its influence: to chart the regions where its grip loosens, and the underlying acausal reality is revealed.


Original article: https://arxiv.org/pdf/2601.00037.pdf

Contact the author: https://www.linkedin.com/in/avetisyan/
