Author: Denis Avetisyan
Researchers have demonstrated a quantum interferometer leveraging entanglement over a 20 km fiber link, paving the way for significantly enhanced astronomical imaging capabilities.

A memory-assisted nonlocal interferometer using entangled photons improves sensitivity for long-baseline telescope applications.
Achieving high-resolution imaging often demands impractically large baselines in optical interferometry. This limitation motivates the development of novel quantum approaches, as demonstrated in ‘Memory-Assisted Nonlocal Interferometer Towards Long-Baseline Telescopes’, which reports a nonlocal interferometer leveraging entanglement between quantum memories. By successfully implementing a 20 km fiber-link baseline and compensating for geometric delays equivalent to 1.5 km, this work showcases enhanced sensitivity for optical interferometry using delocalized single-photon entanglement. Could this memory-assisted approach ultimately unlock new frontiers in astronomical observation and remote quantum sensing?
The Limits of Detection: A Challenge to Conventional Interferometry
Traditional interferometers, instruments that combine light waves to detect incredibly subtle changes, face inherent limitations when attempting to capture the faintest of signals. As the separation between the light paths grows, subtle distortions and scattering diminish the signal’s strength and clarity, and maintaining phase coherence – the precise synchronization of the wave’s crests and troughs – becomes increasingly difficult. Any loss of that synchronization introduces noise, effectively masking the faint signals these instruments are designed to detect. These effects worsen with path length, which is especially damaging in fields like gravitational wave astronomy, where the signals are exceptionally weak yet demand exceptionally long baselines – the effective distance between the detectors. Extending baselines conventionally therefore degrades the very signals it is meant to resolve, restricting the scope of astronomical observations and the precision of measurements in any field reliant on precise wave detection.
Recent advancements in quantum technology have enabled the creation of a nonlocal interferometer boasting an unprecedented 20 km baseline, effectively sidestepping the limitations of traditional interferometry. This achievement hinges on the integration of quantum memories, which act as temporary storage for quantum information, allowing for the separation of the interfering beams over vast distances without immediate signal degradation. By storing and retrieving quantum states at distant nodes, researchers circumvent the phase coherence issues that plague conventional long-baseline interferometers. The result is a significant boost in sensitivity, opening new possibilities for detecting extremely faint signals in fields like gravitational wave astronomy and fundamental physics. This innovative approach not only expands the scale of interferometry but also paves the way for more complex quantum networks and precision measurements.
Quantum states, the fundamental carriers of information in advanced interferometry, are notoriously susceptible to environmental disturbances, rapidly losing the delicate phase coherence essential for precise measurements. This fragility demands sophisticated techniques to shield quantum information from decoherence, such as isolating the system or employing error correction protocols. Researchers are actively developing quantum memories – devices capable of storing quantum information for extended periods – alongside methods to amplify faint signals without disturbing the quantum state. Innovative approaches include squeezing techniques, which reduce noise below the standard quantum limit, and entanglement distribution, enabling nonlocal interferometry where quantum correlations enhance sensitivity. These combined strategies represent a critical frontier in pushing the boundaries of interferometric precision, ultimately unlocking the potential for detecting previously inaccessible phenomena.

Quantum Memories: Anchoring Long-Baseline Interferometry
Cold atomic ensembles function as quantum memories by leveraging the collective properties of laser-cooled atoms. These ensembles provide a medium for mapping incoming photonic quantum states onto the internal degrees of freedom of the atoms via the process of light-matter interaction. This allows for the temporary storage of quantum information carried by photons, effectively buffering against signal attenuation that naturally occurs in optical systems. The stored quantum state can then be retrieved on demand by re-emitting a photon with the original quantum information preserved, thereby mitigating signal loss and enabling applications requiring coherent manipulation of quantum information over extended timescales. The density and temperature of the atomic ensemble are critical parameters influencing storage time and retrieval efficiency.
The DLCZ (Duan, Lukin, Cirac, Zoller) protocol is a method for creating entanglement between photons and a collective atomic excitation within a cold atomic ensemble. The protocol uses weak, off-resonant laser pulses to drive the ensemble, probabilistically scattering a single “herald” photon while simultaneously creating a single collective excitation in the atoms. Crucially, detection of this herald photon signals the successful creation of entanglement between the stored atomic excitation and the emitted optical field. This heralded entanglement is a vital resource for quantum interferometry, as it allows for the distribution of non-classical correlations and the implementation of entanglement-enhanced measurement schemes. Because the protocol is probabilistic, post-selection on the heralding event is required to ensure a reliable entangled state.
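To make the protocol’s probabilistic, heralded character concrete, here is a minimal Monte Carlo sketch of DLCZ-style write attempts. The excitation probability `p_write` and herald detection efficiency `eta_herald` are illustrative placeholders, not values from the paper.

```python
import random

def dlcz_write_attempts(n_trials, p_write=0.01, eta_herald=0.5, seed=0):
    """Monte Carlo sketch of DLCZ write attempts.

    Each trial probabilistically creates a collective spin excitation
    together with a herald (Stokes) photon; the attempt is kept
    (post-selected) only when the herald is actually detected.
    """
    rng = random.Random(seed)
    heralded = 0
    for _ in range(n_trials):
        excited = rng.random() < p_write                   # excitation + herald photon created
        detected = excited and rng.random() < eta_herald   # herald click registered
        if detected:
            heralded += 1
    return heralded

n = 100_000
clicks = dlcz_write_attempts(n)
print(f"heralded events: {clicks} / {n} (rate ~ {clicks / n:.4f})")
```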
Quantum memories extend the capabilities of interferometers by enabling the storage of photons, thereby facilitating the accumulation of phase evolution over extended periods. Traditional interferometry is limited by the transit time of photons through the device; by temporarily halting and storing these photons within the quantum memory – a cold atomic ensemble – the effective interaction time, and consequently the accumulated phase shift, can be significantly increased. This allows for the simulation of a much longer interferometer arm than is physically present, improving the sensitivity of measurements that depend on phase differences. The total accumulated phase $\phi$ grows in proportion to the storage time, effectively multiplying the phase gain achievable with a given physical apparatus.
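As a minimal numeric illustration of this scaling (assuming, purely for demonstration, a fixed 1 kHz frequency offset between the two arms), the accumulated phase $\phi = \Delta \cdot t_{\mathrm{storage}}$ grows linearly with storage time:

```python
import math

# Assumed 1 kHz angular frequency offset between the arms (illustrative only)
delta = 2 * math.pi * 1e3  # rad/s

for t_storage in (1e-6, 10e-6, 100e-6):  # storage times in seconds
    phi = delta * t_storage              # phase grows linearly with storage time
    print(f"t = {t_storage * 1e6:6.1f} us -> phi = {phi:.4f} rad")
```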
Maximizing the performance of quantum memories relies on precise control of the atomic ensemble’s characteristics. Specifically, maintaining long coherence times – the duration for which quantum information is preserved – and high storage efficiency are critical. Recent experiments utilizing cold atomic ensembles have demonstrated an entanglement retrieval efficiency of 26%, representing the proportion of initially stored entanglement successfully retrieved. This efficiency is directly influenced by factors such as atomic density, temperature, and magnetic field homogeneity, all of which require careful optimization to minimize decoherence and maximize the fidelity of the stored quantum state. Achieving higher efficiencies is an ongoing area of research, with improvements directly impacting the sensitivity and performance of quantum interferometers.

Precision Control: Ensuring Coherence in Extended Interferometry
Geometric Delay Compensation (GDC) is a critical component in long-baseline interferometry due to the propagation delay differences experienced by light traversing varying path lengths. In an interferometer with a baseline of 20 km, the time difference between signals arriving at each detector can be significant, potentially exceeding the coherence time of the light source. GDC functions by introducing a precisely controlled, equivalent optical delay to one arm of the interferometer, effectively aligning the wavefronts and enabling constructive interference. This is typically achieved with adjustable optical delay lines or additional lengths of optical fiber; in a memory-assisted scheme, the tunable storage time of the quantum memory itself can supply the required delay. Failure to accurately implement GDC results in a reduction of fringe visibility and introduces systematic errors in the measurement of the interference signal, limiting the sensitivity and accuracy of the interferometer.
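A back-of-the-envelope sketch of the delays involved: the geometric delay for a source at angle $\theta$ from the zenith is $\tau = B\sin\theta / c$. The 20 km baseline and the 1.5 km equivalent delay come from the paper; the source angles below are illustrative.

```python
import math

C = 299_792_458.0      # speed of light in vacuum, m/s
baseline_m = 20_000.0  # 20 km fiber-link baseline

# Geometric delay tau = B * sin(theta) / c for a few illustrative angles
for theta_deg in (1, 10, 30):
    tau = baseline_m * math.sin(math.radians(theta_deg)) / C
    print(f"theta = {theta_deg:2d} deg -> geometric delay = {tau * 1e6:7.3f} us")

# Delay equivalent to the 1.5 km of path compensated in this work
print(f"1.5 km of path ~ {1_500.0 / C * 1e6:.3f} us")
```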
Coincidence counting is a technique used to confirm the presence of quantum interference and to quantify its strength, expressed as visibility. The method involves detecting correlated photon pairs arriving at separate detectors within a defined time window. The rate of coincident events is then compared to the expected rate in the absence of interference. Visibility is calculated from the coincidence counts as $V = (C_{\max} - C_{\min})/(C_{\max} + C_{\min})$, providing a direct measure of the fringe contrast and thus the degree of quantum coherence. A higher fringe contrast, with visibility approaching 1, indicates strong interference and a robust quantum signal, while values closer to 0 signify weak or absent interference.
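For instance, with hypothetical coincidence counts at the fringe maximum and minimum (chosen here so that $V$ lands near the 0.51 reported later), the calculation is a one-liner:

```python
def visibility(c_max, c_min):
    """Fringe visibility from coincidence counts at the fringe
    maximum and minimum: V = (C_max - C_min) / (C_max + C_min)."""
    return (c_max - c_min) / (c_max + c_min)

# Hypothetical counts, not measured data
print(f"V = {visibility(c_max=1510, c_min=490):.2f}")  # -> V = 0.51
```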
Characterization of the thermal field, a significant source of background noise in interferometric measurements, is performed using the second-order correlation function $g^{(2)}$. This function quantifies the probability of detecting photons at different times and positions, allowing coherent signals to be distinguished from incoherent background. Specifically, a purely thermal (incoherent) source exhibits photon bunching with $g^{(2)}(0) = 2$, a coherent source yields $g^{(2)}(0) = 1$, and values below 1 indicate nonclassical, single-photon-like light. Accurate modeling of the thermal field, informed by the measured $g^{(2)}$, is essential for effectively subtracting background noise and maximizing the signal-to-noise ratio in the detection of weak quantum signals.
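A minimal sketch of how $g^{(2)}(0)$ is commonly estimated from raw detector counts in a pulsed experiment; all counts below are hypothetical, chosen only to land near the three canonical regimes.

```python
def g2_zero(n_coinc, n_1, n_2, n_trials):
    """Standard pulsed estimator: g2(0) = N_coinc * N_trials / (N_1 * N_2),
    where N_1 and N_2 are singles counts on the two detectors, N_coinc the
    coincidences, and N_trials the number of experimental trials."""
    return n_coinc * n_trials / (n_1 * n_2)

# Hypothetical counts: thermal light bunches (g2 -> 2), coherent light
# gives g2 = 1, and a heralded single photon is antibunched (g2 < 1).
print(f"thermal-like:  g2 = {g2_zero(200, 10_000, 10_000, 1_000_000):.2f}")
print(f"coherent-like: g2 = {g2_zero(100, 10_000, 10_000, 1_000_000):.2f}")
print(f"single-photon: g2 = {g2_zero(10, 10_000, 10_000, 1_000_000):.2f}")
```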
Measurement precision in this interferometric system is directly linked to the Fisher information extracted from the collected data. This metric provides a statistically rigorous method for determining the achievable accuracy of parameter estimation. Recent experiments have demonstrated a measured interference visibility of $0.51 \pm 0.04$ sustained over a 20 km baseline, indicating a high degree of coherence and signal quality. This level of visibility is crucial for maximizing the Fisher information and, consequently, the precision with which quantum states can be characterized and measured within the interferometer.
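For a standard two-port fringe model with detection probability $p(\phi) = (1 + V\cos\phi)/2$ (an assumption for illustration, not necessarily the paper’s exact analysis), the classical Fisher information per detection event follows directly from the visibility:

```python
import math

def fisher_information(V, phi):
    """Classical Fisher information per detection for a two-outcome
    interferometer with p = (1 + V*cos(phi)) / 2:
    F(phi) = V^2 * sin(phi)^2 / (1 - V^2 * cos(phi)^2)."""
    return (V * math.sin(phi)) ** 2 / (1 - (V * math.cos(phi)) ** 2)

V = 0.51  # visibility reported over the 20 km baseline
for phi_deg in (45, 90, 135):
    F = fisher_information(V, math.radians(phi_deg))
    print(f"phi = {phi_deg:3d} deg -> F = {F:.3f} per detection")
```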

Towards Quantum Networks and Distributed Sensing: Expanding the Horizon
Recent advancements in memory-assisted interferometry are establishing a foundation for the development of robust quantum networks. This technique allows for the storage and retrieval of quantum information, effectively acting as a quantum memory that overcomes limitations imposed by signal loss over distance. By storing quantum states, these networks can bypass the need for direct transmission of fragile qubits, enabling the creation of long-distance entanglement – a critical resource for secure communication and distributed quantum computing. The ability to reliably store and retrieve quantum information represents a significant step towards realizing a quantum internet, where information is transmitted with unparalleled security and computational power, extending the reach of quantum technologies beyond the limitations of classical infrastructure. These networks promise not only secure data transmission but also the potential for linking quantum processors, creating a powerful distributed quantum computer with capabilities exceeding those of any single machine.
The advent of quantum networks promises a paradigm shift in information processing, extending computational power beyond the limitations of single quantum computers. These networks envision a distributed architecture where multiple quantum nodes, each possessing computational capabilities, are interconnected via quantum channels. This allows for the partitioning of complex algorithms, enabling calculations that are intractable for even the most powerful classical or single quantum systems. Furthermore, such networks inherently offer enhanced security through quantum key distribution (QKD), guaranteeing secure communication by leveraging the laws of physics: any attempt to eavesdrop on the quantum channel inevitably introduces detectable disturbances. The potential extends beyond computation and communication: by correlating measurements and reducing noise through quantum entanglement, distributed quantum sensors networked across vast distances could achieve unprecedented precision, opening new frontiers in fields like astronomy and fundamental physics.
The practical implementation of long-distance quantum communication is fundamentally limited by signal loss in transmission media. To overcome this, researchers are actively developing quantum repeater technologies, which function much like classical repeaters but operate on the principles of quantum mechanics. Unlike classical repeaters, which simply amplify a signal along with any accompanying noise, quantum repeaters utilize quantum entanglement and error correction to faithfully extend the range of quantum information transfer. These systems employ intermediate nodes that perform entanglement swapping and purification, effectively reconstructing the quantum state over longer distances without compromising its integrity. This approach works within the constraints of the no-cloning theorem, which prohibits perfect copying of unknown quantum states and thus rules out simple amplification, and it enables the creation of a scalable quantum internet capable of secure communication and distributed quantum computing across geographically separated locations. The successful integration of quantum repeaters promises to unlock the full potential of quantum key distribution and other advanced quantum protocols.
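The scaling argument for repeaters can be illustrated with textbook fiber-loss numbers (an attenuation length of roughly 22 km for standard telecom fiber at 0.2 dB/km; none of these figures are from the paper): direct transmission decays exponentially with distance, while a repeater chain only needs each short segment to succeed.

```python
import math

L_ATT_KM = 22.0  # assumed attenuation length of telecom fiber (~0.2 dB/km)

def p_direct(L_km):
    """Probability that a photon survives direct fiber transmission."""
    return math.exp(-L_km / L_ATT_KM)

segments = 4  # illustrative repeater chain with 4 segments
for L in (20, 100, 500):
    print(f"L = {L:3d} km: direct p = {p_direct(L):.2e}, "
          f"per-segment p = {p_direct(L / segments):.2e}")
```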
The potential of quantum networks extends significantly beyond secure communication, promising substantial advancements in precision measurement through enhanced quantum sensing. Recent demonstrations showcase the capability to achieve a visibility of $0.39 \pm 0.01$ in interferometric measurements, sustained over a $60$ nanosecond window—a crucial timeframe for detecting subtle phenomena. This heightened sensitivity opens new avenues for applications in gravitational wave detection, where identifying ripples in spacetime requires extreme precision, and in fundamental physics research, enabling investigations into previously inaccessible quantum effects. By leveraging entanglement and quantum interference, these sensing capabilities surpass classical limitations, promising a new era of highly accurate and detailed observations of the physical world.
The pursuit of increasingly complex instruments often obscures a fundamental truth: elegance lies in reduction. This research, demonstrating a nonlocal interferometer across 20 km of fiber, exemplifies that principle. The authors bypassed the need for physically transporting fragile quantum states by leveraging entanglement within quantum memories – a solution that feels less like engineering and more like a clever sidestep. As Thomas Edison once observed, “I have not failed. I’ve just found 10,000 ways that won’t work.” The iterative process of finding what doesn’t work, meticulously stripped of unnecessary layers, ultimately yielded a system capable of improved sensitivity for long-baseline telescopes. It’s a reminder that sometimes, the most profound advancements arise not from adding complexity, but from bravely subtracting it.
Further Horizons
The demonstration of interferometry across 20 km, while significant, merely clarifies the scale of remaining challenges. Sensitivity gains, predictably, remain tethered to memory coherence times. Extending baseline length does not, in itself, solve the decoherence problem; it amplifies it. Future iterations will necessitate not simply better memories, but fundamentally different approaches to quantum storage – perhaps accepting probabilistic read-out as a necessary cost.
Current limitations are not optical, but architectural. Scaling this beyond a single link demands addressing the complexities of quantum repeaters – a task currently stalled by the difficulty of creating truly entangled states with high fidelity and low loss. The pursuit of topological protection in quantum memories offers a potential, if distant, pathway. Simpler, however, may be a pragmatic acceptance of loss, combined with increasingly sophisticated error correction schemes.
Ultimately, the utility of nonlocal interferometry hinges on demonstrating a quantifiable advantage over classical techniques. The promise of enhanced angular resolution for astronomical imaging, or increased precision in fundamental constant measurements, remains theoretical. Clarity is the minimum viable kindness; the field must now deliver empirical evidence to justify the inherent complexities.
Original article: https://arxiv.org/pdf/2511.10988.pdf
Contact the author: https://www.linkedin.com/in/avetisyan/