Why Quantum Mechanics Needs Relativity

Author: Denis Avetisyan


New research reveals that the fundamental rules governing quantum behavior aren’t arbitrary, but a natural consequence of enforcing the principles of relativity.

Relativistically invariant probability functions adhere to three core principles: pairwise Kolmogorov additivity, ensuring probabilities of alternatives are determined solely by single and two-path contributions; time symmetry, maintaining invariance under time reversal; and Bayesian composition, enabling factorization of transition probabilities analogous to the Bayesian chain rule, thereby establishing a consistent framework for probabilistic reasoning within relativistic contexts.

The standard quantum-mechanical path-integral formulation emerges from imposing relativistic invariance and minimal probabilistic assumptions, resolving long-standing questions about its origins.

The foundations of quantum mechanics have long appeared disconnected from the principles of relativity, relying on seemingly arbitrary postulates regarding probability amplitudes. This paper, ‘The relativistic reason for quantum probability amplitudes’, resolves this tension by demonstrating that the standard quantum formalism, specifically the Feynman path integral, emerges naturally from imposing relativistic invariance on a probabilistic description of particle motion. By demanding only pairwise Kolmogorov additivity, time symmetry, and adherence to Bayes’ rule for multi-path histories, the paper derives the complex amplitudes central to quantum mechanics. Does this perspective offer a pathway towards unifying quantum mechanics and relativity, and ultimately, a deeper understanding of the universe?


The Quantum Riddle: Probability at the Edge of Reality

Despite its extraordinary predictive power and technological applications, quantum theory remains conceptually unsettling due to a lack of a fully robust probabilistic foundation. The theory routinely predicts outcomes as probabilities – the likelihood of finding an electron in a particular location, for example – but how these probabilities themselves arise is not fully explained within the framework of the theory. This isn’t merely a mathematical gap; it strikes at the heart of how reality operates, prompting questions about whether quantum probabilities represent fundamental aspects of the universe or simply reflect a lack of complete knowledge. Existing probabilistic frameworks, like Kolmogorov’s, struggle to seamlessly integrate with the core tenets of quantum mechanics, particularly concerning measurement and entanglement. Consequently, physicists and mathematicians are actively exploring new approaches to probability, seeking a foundation that can consistently and meaningfully describe the quantum realm and potentially reveal deeper insights into the nature of reality itself.

Despite its widespread success in modeling uncertainty, standard Kolmogorov probability theory encounters fundamental difficulties when applied to the realm of relativistic physics and quantum mechanics. This framework, built on the concept of a sample space and a probability measure, assumes a clear separation between events and the ability to assign probabilities based on frequencies of outcomes. However, relativity introduces challenges related to simultaneity and observer dependence, blurring the lines of what constitutes a definite event. More critically, quantum mechanics posits inherent indeterminacy – the probabilistic nature isn’t merely a reflection of incomplete knowledge, but a fundamental property of reality. Concepts like superposition and entanglement defy classical probabilistic descriptions, as assigning probabilities to individual outcomes before measurement becomes problematic. Attempts to force quantum phenomena into the Kolmogorov framework often lead to inconsistencies or require ad-hoc modifications, suggesting the need for a revised probabilistic foundation capable of seamlessly incorporating the principles of both relativity and quantum theory.

A robust reformulation of probability isn’t merely a mathematical exercise, but a fundamental requirement for a coherent physical worldview. Current probabilistic frameworks, while effective in many contexts, falter when applied to the relativistic scenarios and counterintuitive phenomena inherent in quantum mechanics. This disconnect manifests as conceptual inconsistencies – ambiguities in interpreting quantum states, challenges in defining objective probabilities, and difficulties in reconciling quantum measurements with classical intuition. Addressing these issues demands a probabilistic foundation capable of seamlessly integrating with the principles of special relativity and general relativity, potentially requiring a departure from traditional Kolmogorovian axioms. Successfully reconstructing this foundation promises not only to resolve long-standing paradoxes within quantum theory, but also to pave the way for a unified physical theory that consistently describes gravity, quantum mechanics, and the very nature of reality itself.

Relativity’s Constraints: Reshaping the Rules of Probability

The imposition of relativistic invariance on probability functions necessitates a departure from standard probability theory because conventional functions are not generally Lorentz invariant. This means that probability assignments are not preserved under Lorentz transformations – changes in reference frame due to constant relative velocity. Specifically, a probability distribution $P(x)$ defined in one frame must transform to a valid probability distribution in another frame related by a Lorentz transformation $\Lambda$, requiring $P'(x') = P(\Lambda^{-1} x')$ to satisfy $0 \le P'(x') \le 1$ for all $x'$. Consequently, permissible probability functions are limited to those that remain well-defined and non-negative under boosts and spatial rotations, significantly restricting the forms they can take compared to the unconstrained case.
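The role of Lorentz transformations here can be made concrete with a small numeric check. The sketch below is illustrative only (units with $c = 1$, a 1+1-dimensional boost, arbitrary sample coordinates): a boost changes the coordinates of an event, while the spacetime interval – the kind of invariant quantity a relativistically invariant probability function may depend on – is unchanged.

```python
import math

def boost(t, x, v):
    """Apply a 1+1-dimensional Lorentz boost with velocity v (units c = 1)."""
    gamma = 1.0 / math.sqrt(1.0 - v * v)
    return gamma * (t - v * x), gamma * (x - v * t)

def interval(t, x):
    """Spacetime interval s^2 = t^2 - x^2, a Lorentz-invariant quantity."""
    return t * t - x * x

t, x = 3.0, 1.0
tp, xp = boost(t, x, 0.6)
# The coordinates change under the boost, but the interval does not:
print((t, x), "->", (tp, xp))
print(interval(t, x), interval(tp, xp))
```

A function of $x$ alone is frame-dependent; a function of the interval is not, which is why the invariance requirement restricts permissible probability functions so sharply.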

The imposition of time symmetry as a constraint on probability functions necessitates invariance under time reversal transformations, meaning the probability of a sequence of events must remain unchanged if the temporal order of those events is reversed. Mathematically, this implies that for any probability function $P(x_1, x_2, \ldots, x_n)$, it must hold that $P(x_n, x_{n-1}, \ldots, x_1) = P(x_1, x_2, \ldots, x_n)$. This constraint significantly limits the permissible forms of probability distributions, particularly in scenarios where the underlying physics, such as quantum mechanics, does not inherently possess time-reversal symmetry. Consequently, standard probabilistic models relying on sequential updates or Markovian processes may require modification or replacement to comply with this fundamental symmetry requirement.
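A toy illustration of this constraint (a hypothetical weight function, not the paper's construction): a path probability that depends only on step sizes is automatically invariant under reversing the sequence, since reversal permutes the steps without changing any $|x_{k+1} - x_k|$.

```python
import math

def path_weight(xs):
    """Toy path weight (hypothetical) depending only on squared step sizes."""
    return math.exp(-sum((b - a) ** 2 for a, b in zip(xs, xs[1:])))

forward = [0.0, 1.0, 0.5, 2.0]
backward = list(reversed(forward))

# Reversal reorders the steps but preserves each step length, so the
# weight satisfies the time-symmetry condition P(x_n, ..., x_1) = P(x_1, ..., x_n).
print(path_weight(forward), path_weight(backward))
```

Any candidate probability function failing such a reversal check would be excluded by the time-symmetry requirement.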

The imposition of relativistic invariance and time symmetry on probabilistic systems fundamentally challenges the axioms of classical probability theory. Traditional probability relies on the assumption of absolute time and well-defined observer frames, which are incompatible with the principles of special relativity. Consequently, concepts such as sequential conditioning – central to the Bayesian approach – require careful reformulation. The standard Bayesian composition rule, $P(A,B) = P(A)P(B|A)$, is not generally Lorentz invariant and thus cannot be directly applied to relativistic scenarios. This necessitates exploring alternative frameworks for updating probabilities, potentially involving non-additive measures or generalized Bayesian rules that accommodate the principles of relativistic invariance and the inherent symmetries of quantum mechanics, leading to a more nuanced and complex understanding of quantum probability.
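For reference, the classical composition rule that the text says must be re-examined can be checked directly on a discrete joint distribution (the numerical values below are arbitrary illustrations):

```python
# Hypothetical joint distribution over outcome pairs (A, B).
joint = {
    ("a1", "b1"): 0.2, ("a1", "b2"): 0.3,
    ("a2", "b1"): 0.4, ("a2", "b2"): 0.1,
}

def marginal_a(a):
    """P(A = a), summing the joint over all B outcomes."""
    return sum(p for (x, _), p in joint.items() if x == a)

def conditional_b_given_a(b, a):
    """P(B = b | A = a) from the joint and the marginal."""
    return joint[(a, b)] / marginal_a(a)

# The chain rule P(A, B) = P(A) * P(B | A) reconstructs every joint entry:
for (a, b), p in joint.items():
    assert abs(marginal_a(a) * conditional_b_given_a(b, a) - p) < 1e-12
print("chain rule holds")
```

It is precisely this kind of unambiguous factorization, trivial in the classical setting, that relativistic invariance forces one to generalize.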

The Bayesian Composition Rule, typically expressed as $P(A,B) = P(A|B)P(B)$, faces challenges when applied to relativistic and quantum systems due to the non-classical nature of state updates and information propagation. Standard Bayesian inference assumes a well-defined order of events and a separable system, both of which can be violated in relativistic scenarios involving entanglement or spacetime distortions. Specifically, the rule relies on conditional probabilities being unambiguously defined, but relativistic causality and the principles of quantum mechanics introduce correlations that invalidate the assumption of independent updates. Re-evaluation necessitates considering how state information is affected by Lorentz transformations and quantum measurement, potentially requiring modifications to the composition rule or alternative frameworks for probabilistic reasoning that account for non-classical correlations and the fundamental limitations on knowledge imposed by these physical theories.

The Path Integral: A Quantum Recipe for Probability

The Feynman Path Integral formulation of quantum mechanics arises from the requirement that probability calculations remain consistent under Lorentz transformations, a core tenet of special relativity. Traditional probability, based on the Kolmogorov axioms, does not inherently possess this invariance. To achieve relativistic invariance, the probability amplitude for a particle to propagate from one spacetime point to another must be calculated by summing contributions from all possible paths between those points, weighted by a phase factor proportional to the classical action, $S[\text{path}]$. This summation effectively integrates over all paths, and the resulting integral provides a probability amplitude that transforms correctly under boosts and rotations, satisfying the demands of relativistic invariance. Consequently, the Path Integral isn’t an arbitrary mathematical construction, but a direct result of imposing relativistic principles on probabilistic descriptions in quantum mechanics.

The probability amplitude in the Feynman path integral formulation is determined by weighting each possible path between initial and final states with a complex exponential factor. This weighting is directly proportional to the value of the action, $S$, evaluated for that specific path. Crucially, the action is a path invariant – its value remains constant regardless of coordinate transformations along the path. Similarly, proper time, $\tau$, which represents the time measured by an observer moving along the path, is also a path invariant. The integral over all possible paths, each weighted by $e^{iS/\hbar}$ or a similar factor incorporating proper time, then yields the overall probability amplitude for the quantum event. The invariance of these quantities under coordinate transformations and reparameterizations of the path is fundamental to the mathematical consistency and physical interpretation of the path integral formalism.
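The "sum over paths" can be sketched numerically. The toy below is an assumption-laden discretization, not the paper's derivation: natural units ($\hbar = m = \Delta t = 1$), a free particle hopping on three lattice sites over three intermediate time slices. Each lattice path is weighted by $e^{iS}$, and probabilities come from the squared modulus of the summed amplitude.

```python
import cmath
from itertools import product

SITES = (-1, 0, 1)   # lattice positions available at each intermediate step
STEPS = 3            # number of intermediate time slices

def action(path):
    """Discretized free-particle action: sum of (1/2) * (dx/dt)^2 * dt."""
    return sum(0.5 * (b - a) ** 2 for a, b in zip(path, path[1:]))

def amplitude(x_start, x_end):
    """Sum exp(i*S) over every lattice path between the fixed endpoints."""
    total = 0j
    for middle in product(SITES, repeat=STEPS):
        path = (x_start, *middle, x_end)
        total += cmath.exp(1j * action(path))
    return total

# Relative (unnormalized) probabilities for different final positions:
probs = {x: abs(amplitude(0, x)) ** 2 for x in SITES}
print(probs)
```

The paths interfere through their phases, so the resulting probabilities are not simple sums of per-path weights – the signature feature the article attributes to relativistic invariance.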

The standard Kolmogorov axioms of probability, while sufficient for describing classical systems, prove inadequate when applied to the Feynman path integral formulation of quantum mechanics. This inadequacy stems from the path integral’s summation over all possible spacetime paths, including those not permissible in classical physics, and the resulting probability amplitudes which can be complex numbers. Consequently, theories of Non-Classical Probability are required to provide a consistent mathematical framework. These theories relax or modify the standard axioms, often allowing for negative probabilities or interference effects that violate the law of total probability as classically understood. Specifically, the requirement of unitarity in quantum mechanics necessitates a probabilistic framework that extends beyond the limitations of strictly positive probability measures, leading to investigations into alternative axiom systems and generalized probability measures like those based on positive operator-valued measures (POVMs).

The consideration of superluminal frames of reference, while presenting apparent violations of causality, serves as a valuable method for identifying the limits of permissible probabilistic descriptions in quantum mechanics. By mathematically exploring scenarios where information appears to propagate faster than light, researchers can delineate the conditions under which standard probability axioms, particularly those related to non-negative probabilities and normalization, break down. This approach does not posit the physical reality of superluminal signaling, but rather uses these frames as a theoretical construct to test the robustness of probabilistic frameworks and to define the boundaries of valid quantum mechanical descriptions. Specifically, analyzing probability amplitudes in these frames reveals constraints on the permissible forms of probability distributions and highlights the necessity of potentially revising or extending the Kolmogorov axioms to accommodate quantum phenomena, such as tunneling or entanglement, which lack classical probabilistic analogs.

Pairwise Additivity: Towards a Consistent Quantum Picture

The mathematical structure underpinning quantum mechanics demands a rigorous consistency, and imposing pairwise Kolmogorov additivity is a crucial step in achieving this. This principle restricts the contributions to probabilistic interference to only those arising from pairs of events, effectively eliminating any higher-order interference terms. By limiting interactions in this way, the derived probability functions remain mathematically well-defined and avoid the inconsistencies that plague alternative formulations. This simplification isn’t merely a mathematical trick; it directly addresses potential paradoxes that arise when considering the joint probability of multiple quantum events. The resulting framework ensures that probabilities remain non-negative and sum to one, upholding the fundamental axioms of probability theory and providing a solid foundation for a consistent quantum picture of reality, where $P(A \cup B) = P(A) + P(B)$ holds true for all non-overlapping events A and B.

The theoretical framework, built upon principles of pairwise additivity, establishes a robust basis for interpreting complex probability amplitudes – mathematical objects fundamentally linked to the quantum portrayal of reality. These amplitudes, differing from classical probabilities, can be positive, negative, or even imaginary, and crucially, dictate the likelihood of observing specific outcomes in quantum events. This approach doesn’t merely describe what happens, but explains how these amplitudes combine and interfere, determining probabilities through calculations involving the square of their absolute values, represented mathematically as $P = |A|^2$. By consistently applying pairwise additivity, the theory ensures that these complex interactions are mathematically sound, offering a pathway to reconcile the abstract nature of quantum amplitudes with observable physical phenomena and potentially bridging the gap between quantum mechanics and broader physical principles.

The developed theoretical framework doesn’t merely offer a mathematically consistent approach to quantum probability; it proposes a potential resolution to long-standing conceptual challenges within quantum mechanics. By rigorously limiting interference to pairwise contributions, the theory sidesteps paradoxes arising from the superposition principle and entanglement, offering a more intuitive grasp of quantum phenomena. Furthermore, this consistency isn’t achieved in isolation; the principles underpinning this model align with broader physical considerations, suggesting a pathway toward unifying quantum mechanics with other fundamental forces and principles. The framework’s emphasis on pairwise additivity and the rejection of higher-order interference terms – any genuinely three-way or higher interference contribution vanishes – provides a foundational structure potentially compatible with emergent phenomena and classical limits, fostering a more holistic understanding of reality where quantum and classical descriptions seamlessly converge.

The mathematical structure of this theory hinges on a fundamental simplification: the complete absence of genuine interference among three or more events. The derivation explicitly establishes that any higher-order interference term – the part of a joint probability not accounted for by single-event and pairwise contributions – evaluates to zero. This isn’t merely a computational convenience, but a core tenet of the framework; it dictates that the overall probability of a combined outcome is strictly determined by single-path and pairwise contributions. Consequently, the complex interplay of quantum phenomena is reduced to a manageable and consistent set of two-event probabilities, effectively eliminating the paradoxical effects that would arise from higher-order interference and bolstering the theory’s internal coherence. This simplification allows for a mathematically sound and conceptually streamlined approach to understanding complex probability amplitudes, offering a potential resolution to long-standing difficulties within quantum mechanics.
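The claim that interference stops at pairs can be verified numerically for Born-rule probabilities. In the sketch below (complex amplitudes chosen arbitrarily for illustration), the second-order interference term is generally nonzero, while the third-order term cancels identically for any choice of amplitudes.

```python
def born(*amps):
    """Probability of a set of paths taken together: |sum of amplitudes|^2."""
    return abs(sum(amps)) ** 2

# Arbitrary illustrative complex amplitudes for three paths.
a, b, c = 0.3 + 0.4j, -0.2 + 0.5j, 0.6 - 0.1j

# Second-order (pairwise) interference: generally nonzero.
i2 = born(a, b) - born(a) - born(b)

# Third-order interference: every cross term already appears in a
# pairwise contribution, so the combination cancels exactly.
i3 = (born(a, b, c)
      - born(a, b) - born(a, c) - born(b, c)
      + born(a) + born(b) + born(c))

print(i2, i3)  # i3 is zero up to floating-point rounding
```

Expanding $|a+b+c|^2$ shows why: it contains only single-path moduli and pairwise cross terms, so subtracting the pairwise probabilities and adding back the singles leaves nothing.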

Kolmogorov additivity is illustrated using Venn diagrams demonstrating that the probability of the union of disjoint events is the sum of their individual probabilities, applicable to both two and three events with no overlapping outcomes.

The pursuit of a logically consistent quantum mechanics, as demonstrated in this work concerning relativistic invariance and probability amplitudes, echoes a fundamental principle of responsible innovation. This paper reveals how seemingly arbitrary rules, like the Feynman path integral, arise from deeper physical constraints – a principle mirrored in the ethical design of algorithms. As Louis de Broglie noted, “It is in the interplay between theory and experiment that we discover the true nature of reality.” The research highlights that the mathematical framework isn’t simply imposed, but emerges from the demand for consistency between quantum mechanics and the principles of relativity – a testament to the idea that every system encodes a worldview and bears responsibility for its outcomes. The core concept of deriving quantum rules from fundamental principles directly counters the notion of unchecked technological advancement.

The Road Ahead

This work, while establishing a compelling link between relativistic invariance and the foundations of quantum mechanics, does not dissolve the inherent ambiguities within probabilistic interpretation. The insistence on Kolmogorov additivity, a mathematically convenient but physically motivated assumption, remains a point of contention. Every bias report is society’s mirror; this insistence on a particular mathematical structure invites scrutiny of why that structure is favored, and what alternatives might be equally, or more, reflective of physical reality. The demonstrated emergence of complex amplitudes does not, in itself, resolve the measurement problem; it merely shifts the locus of inquiry.

Future investigations must confront the implications of superluminal frames, not as mathematical curiosities, but as potential signposts toward a deeper understanding of non-locality. The comfortable assumption of a universal spacetime, while pragmatically useful, may prove insufficient. The field needs to address how these frameworks accommodate, or preclude, alternative probabilistic structures beyond Kolmogorov’s axioms.

Privacy interfaces are forms of respect; similarly, a truly complete theory will acknowledge the inherent limits of observability and the unavoidable role of the observer. The pursuit of a fully relativistic quantum theory is not simply a technical exercise; it is an exercise in defining the boundaries of knowledge itself.


Original article: https://arxiv.org/pdf/2512.10497.pdf

Contact the author: https://www.linkedin.com/in/avetisyan/

2025-12-13 09:25