Entangled Light from a Quantum Dot

Author: Denis Avetisyan


Researchers have demonstrated a robust source of entangled photons using a novel quantum dot technique, paving the way for more efficient quantum communication systems.

The observation of Franson interference fringes and a violation of the CHSH inequality demonstrates time-bin entanglement, evidenced by a correlation function visibility of 92.8±2.6% and a pump power dependence consistent with theoretical predictions, even at an average input photon number of 0.01.

This work reports the violation of the CHSH Bell inequality using time-bin entanglement generated from vacuum-one-photon superposition states emitted by a resonant fluorescence quantum dot.

While generating entangled photons typically necessitates complex multi-photon processes, this work, ‘Bell Inequality Violation with Vacuum-One-Photon Number Superposition States’, demonstrates a novel approach utilizing resonant fluorescence from a single quantum dot to directly create vacuum-one-photon superposition states. We observe a clear violation of the Clauser-Horne-Shimony-Holt Bell inequality via Franson-type interferometry, confirming the generation of time-bin entanglement without the need for multiphoton generation. Does this simplified scheme pave the way for brighter, more scalable solid-state sources essential for long-distance quantum communication?


Unveiling the Quantum Realm: Foundations of Superposition

Quantum information science fundamentally relies on the principle of superposition, extending beyond simple binary bits to harness the full potential of quantum states. This is elegantly achieved through Fock states, which describe the number of photons in a specific mode of the electromagnetic field. Unlike classical bits limited to 0 or 1, a quantum bit, or qubit, can exist as a combination of these states – a superposition. Crucially, Fock states aren’t limited to representing just 0 or 1 photons; they can represent any number, effectively creating an unbounded computational space. This means the potential for representing complex information grows exponentially with each added photon, offering a pathway to solve problems intractable for classical computers. The ability to manipulate and control these states, represented mathematically as $ |n\rangle $ where ‘n’ is the number of photons, forms the bedrock for advanced quantum algorithms and communication protocols.
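
To make this concrete, the following minimal NumPy sketch (an illustration, not code from the paper) constructs a vacuum-one-photon qubit $\alpha|0\rangle + \beta|1\rangle$ in a truncated Fock basis; the amplitudes are arbitrary choices:

```python
import numpy as np

# Truncated Fock basis: index n = photon number (here n = 0 or 1).
# A vacuum-one-photon qubit is alpha|0> + beta|1>.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # illustrative amplitudes
psi = np.array([alpha, beta], dtype=complex)

# Normalization: |alpha|^2 + |beta|^2 = 1.
assert np.isclose(np.vdot(psi, psi).real, 1.0)

# Probability of detecting n photons is |<n|psi>|^2.
p_vacuum, p_one = np.abs(psi) ** 2
print(f"P(0 photons) = {p_vacuum:.2f}, P(1 photon) = {p_one:.2f}")
```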

Vacuum-one-photon states represent a particularly promising foundation for quantum information processing due to their inherent resilience and precise controllability. Constructed from Fock states – which describe specific numbers of photons – these states leverage the quantum vacuum as a crucial component, effectively encoding information in the presence or absence of a single photon. This approach minimizes decoherence, a significant challenge in quantum computing, as the information is less susceptible to environmental noise compared to systems relying on multiple photons or material qubits. Furthermore, the well-defined energy levels and readily manipulated properties of single photons allow for precise control over qubit states and the creation of complex entanglement, essential for secure quantum communication and powerful quantum computation. Researchers are actively developing methods to generate, manipulate, and detect these states with increasing fidelity, paving the way for scalable and robust quantum technologies utilizing the fundamental properties of light.

The realization of robust quantum communication networks hinges on the ability to create and manipulate quantum bits, or qubits, and vacuum-one-photon states provide a particularly promising avenue for achieving this with photons. These states, representing the absence or presence of a single photon, serve as ideal carriers of quantum information due to their resilience against decoherence and ease of transmission through optical fibers. Crucially, by precisely controlling these states, researchers can generate complex entangled states – where two or more photons become inextricably linked, regardless of the distance separating them. This entanglement is the fundamental resource driving quantum key distribution, teleportation, and other protocols that promise unconditionally secure communication and distributed quantum computation. The creation of these entangled states, therefore, is not merely a technological feat, but the essential building block for a future quantum internet.
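
To illustrate the structure of such a state, this hedged NumPy sketch builds a single photon coherently shared between two modes, the form underlying time-bin entanglement; the phase and mode labels are illustrative assumptions rather than the paper's implementation:

```python
import numpy as np

# Two optical modes (e.g. early/late time bins), each truncated to {|0>, |1>}.
vac = np.array([1, 0], dtype=complex)   # |0>
one = np.array([0, 1], dtype=complex)   # |1>

# A single photon delocalized over the two modes: (|1,0> + e^{i phi}|0,1>)/sqrt(2).
phi = 0.0   # relative phase between the two bins (arbitrary here)
psi = (np.kron(one, vac) + np.exp(1j * phi) * np.kron(vac, one)) / np.sqrt(2)

# Joint photon-number probabilities P(n_a, n_b) = |<n_a, n_b|psi>|^2.
probs = np.abs(psi.reshape(2, 2)) ** 2
print(probs)   # only (1,0) and (0,1) occur: the photon is shared coherently
```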

A quantum light source, resonantly driven and split into two time-bin analyzers with stabilized phases, enables coincidence counting using single-photon detectors and characterization of photon statistics via RF spectroscopy and autocorrelation measurements.

Engineering Quantum Light: From Quantum Dots to Entanglement

Quantum dot (QD) micropillar devices function as efficient single-photon sources due to the quantum confinement effect within the semiconductor nanocrystal. When a resonant continuous wave laser excites the QD, electrons are promoted to higher energy levels. Radiative recombination from these excited states results in the emission of photons. The micropillar structure enhances light extraction efficiency and provides a resonant cavity, increasing the probability of photon emission and suppressing multi-photon events. This configuration ensures a high flux of individual, indistinguishable photons, critical for applications in quantum cryptography, quantum computing, and quantum communication. The emission wavelength is determined by the size and composition of the quantum dot, allowing for tunability across a broad spectral range.

Radio frequency (RF) excitation of quantum dots allows for the generation of both vacuum and single-photon states exhibiting temporal de-localization. This technique manipulates the quantum dot’s emission characteristics beyond those achievable with continuous wave excitation alone. By applying RF signals, the temporal spread of the emitted photons can be precisely controlled, influencing the duration and shape of the wave packet. This control is achieved through the modulation of the quantum dot’s dipole moment, which affects the phase coherence of the emitted photons. The ability to tailor the temporal properties of these states is critical for applications requiring precise timing, such as long-distance quantum communication and quantum information processing, as it minimizes decoherence effects related to photon indistinguishability.

Resonance fluorescence, achieved by exciting a quantum dot with a laser tuned to its exciton resonance, is central to generating the superposition states required for entanglement. This process involves the continuous absorption and re-emission of photons, creating a coherent state where the quantum dot exists in a superposition of its ground state (vacuum) and excited state (single photon emission). By carefully controlling the excitation parameters – specifically the laser power and detuning – it is possible to manipulate the relative amplitudes of these states. This precisely prepared superposition, representing a probabilistic mixture of vacuum and single-photon states, serves as the initial quantum state upon which subsequent operations are performed to create entangled photon pairs. The efficiency of preparing these superpositions directly impacts the fidelity of the resulting entanglement.
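
How laser power and detuning tune the superposition amplitudes can be seen from the textbook steady-state solution of the optical Bloch equations for a driven two-level emitter; this is a generic sketch, not the paper's specific model:

```python
import numpy as np

def excited_population(omega, delta, gamma):
    """Steady-state excited-state population of a driven two-level emitter
    (standard optical Bloch equations result).
    omega: Rabi frequency, delta: laser detuning, gamma: decay rate."""
    return (omega**2 / 4) / (delta**2 + gamma**2 / 4 + omega**2 / 2)

gamma = 1.0   # decay rate sets the frequency unit
for omega in [0.1, 0.5, 1.0, 5.0]:
    p_e = excited_population(omega, delta=0.0, gamma=gamma)
    # Weak driving keeps the single-photon amplitude small, approximating a
    # vacuum-dominated superposition; strong driving saturates p_e toward 1/2.
    print(f"Omega/Gamma = {omega:.1f}: rho_ee = {p_e:.4f}")
```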

The second-order correlation function, $g^{(2)}(0)$, quantitatively assesses the clustering of photons in a light source; a value of 1 corresponds to coherent light with Poissonian statistics, a value of 0 to an ideal single-photon source, and any value below 1 demonstrates non-classical light emission. Measurement of $g^{(2)}(0) = 0.037(3)$ at low excitation power confirms the non-classical character of the light emitted from the quantum dot-micropillar devices. This value, far below 1, indicates strong photon antibunching: photons are emitted one at a time, with a strongly suppressed probability of coincident detection, validating the device's functionality as a source of non-classical light.
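
For context, $g^{(2)}(0)$ is commonly estimated from a Hanbury Brown-Twiss measurement by normalizing the zero-delay coincidence counts to the accidental rate expected for uncorrelated detections. The sketch below uses hypothetical numbers chosen purely for illustration, not the paper's data:

```python
import numpy as np

def g2_zero(coincidences_zero, singles_a, singles_b, bin_width, total_time):
    """Normalized second-order correlation at zero delay: coincidences in
    the zero-delay bin divided by the accidentals expected for
    uncorrelated (Poissonian) detections."""
    rate_a = singles_a / total_time
    rate_b = singles_b / total_time
    accidentals = rate_a * rate_b * bin_width * total_time
    return coincidences_zero / accidentals

# Hypothetical counts, purely illustrative:
print(g2_zero(coincidences_zero=37, singles_a=1e6, singles_b=1e6,
              bin_width=1e-9, total_time=1.0))   # -> 0.037
```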

The measured and simulated second-order correlation functions reveal phase-dependent interference visibility in resonance fluorescence, demonstrating how the coincidence count rate varies with the relative phase between detectors.

Verifying Entanglement: A Time-Bin Approach to Quantum Correlation

Time-bin entanglement, an alternative to polarization entanglement, utilizes vacuum and single-photon states to encode quantum information in the arrival time of a photon. This encoding is achieved by superposing a photon across two distinct time bins, effectively creating a superposition of being present in either an early or a late time slot. This approach is particularly valuable for quantum key distribution (QKD), quantum teleportation, and other quantum information protocols due to its resilience against decoherence during fiber optic transmission. The separation in time, rather than in polarization, minimizes the impact of environmental disturbances on the fragile quantum state, allowing for secure communication over longer distances. Furthermore, time-bin entanglement is compatible with standard telecom wavelengths, facilitating integration with existing fiber optic infrastructure.

Entangled states for time-bin encoding are prepared utilizing fiber beam splitters and Asymmetric Mach-Zehnder interferometers (AMZIs). Fiber beam splitters are employed to superimpose photon paths, creating the necessary quantum superposition for entanglement. AMZIs, distinguished by having unequal path lengths in their arms, introduce a time delay between the two possible paths of a photon. This time difference is crucial for defining the time bins used in the entanglement process. By precisely controlling the path lengths and employing phase modulation, the AMZIs generate the required temporal correlation between photons, effectively creating the entangled state used for quantum information protocols. The asymmetry in the interferometer arms is fundamental for distinguishing between the early and late time bins, enabling projective measurements in the time domain.
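
A toy single-photon model makes the AMZI's role concrete. The sketch below treats the long arm simply as a controllable phase on one path, an idealization that ignores the temporal distinguishability of the bins; the beam-splitter convention is an assumption:

```python
import numpy as np

# 50:50 beam splitter acting on two mode amplitudes (symmetric convention).
BS = np.array([[1, 1j], [1j, 1]], dtype=complex) / np.sqrt(2)

def amzi_output(phase):
    """Single-photon output amplitudes of an asymmetric Mach-Zehnder
    interferometer, modeling the long arm's delay as a pure phase."""
    amp_in = np.array([1, 0], dtype=complex)   # photon enters port 0
    after_first = BS @ amp_in
    # Short arm: unchanged (early bin). Long arm: extra phase (late bin).
    delayed = after_first * np.array([1, np.exp(1j * phase)])
    return BS @ delayed

for phi in [0, np.pi / 2, np.pi]:
    out = amzi_output(phi)
    print(f"phi = {phi:.2f}: P(port 0) = {abs(out[0])**2:.2f}, "
          f"P(port 1) = {abs(out[1])**2:.2f}")
```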

The Franson interferometer configuration enables entanglement verification by performing projective measurements in the time-bin basis. This is achieved by directing the entangled photon pairs from the Asymmetric Mach-Zehnder interferometers (AMZIs) into the Franson interferometer, which recombines the photons and allows for two-photon interference. Successful interference, detected using single-photon detectors, demonstrates the correlation between the arrival times of the photons, a key signature of entanglement. Specifically, coincident detection rates exceeding classical limits confirm the non-classical correlations and validate the creation of entangled states in the time-bin degree of freedom. The observed interference pattern is directly related to the indistinguishability of the photons in the time-bins, providing a quantitative measure of entanglement fidelity.
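
In the standard Franson analysis, the post-selected coincidence probability depends only on the sum of the two local analyzer phases. A minimal sketch of that fringe, taking the paper's measured visibility as its only input:

```python
import numpy as np

def franson_coincidence(phi_a, phi_b, visibility=1.0):
    """Coincidence probability in the post-selected central time slot of a
    Franson interferometer: sensitive only to the SUM of the analyzer
    phases, the hallmark of time-bin (energy-time) entanglement."""
    return 0.25 * (1 + visibility * np.cos(phi_a + phi_b))

# Scanning one analyzer phase traces out the two-photon fringe:
for phi in np.linspace(0, np.pi, 5):
    print(f"phi_a = {phi:.2f}: P_cc = {franson_coincidence(phi, 0.0, 0.928):.3f}")
```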

Asymmetric Mach-Zehnder interferometers (AMZIs) used in time-bin entanglement experiments are highly sensitive to environmental fluctuations, which can degrade the coherence of the entangled states. Phase locking techniques actively stabilize the relative phase between the two paths of the AMZI, typically by employing feedback loops that monitor and correct phase drift. These techniques commonly utilize piezoelectric transducers to adjust mirror positions or fiber lengths, maintaining constructive and destructive interference at the desired time bins. Precise phase control is critical; deviations from the optimal phase induce errors in the entangled state preparation, reducing the visibility of interference fringes and ultimately lowering the fidelity of quantum information processing. The effectiveness of phase locking is directly correlated to the stability of the entanglement and the success of subsequent quantum operations.
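
As a rough illustration of such a servo, the toy simulation below applies a proportional-integral correction to a slowly drifting phase; all gains and noise levels are invented for illustration and bear no relation to the experiment's actual feedback loop:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: environmental phase drift corrected by a PI loop acting on an
# actuator such as a piezo fiber stretcher. All parameters are illustrative.
target, drift, correction = 0.0, 0.3, 0.0
k_p, k_i, integral = 0.5, 0.1, 0.0

for step in range(200):
    drift += rng.normal(0, 0.01)                 # random phase drift per step
    error = drift + correction - target          # measured phase error
    integral += error
    correction -= k_p * error + k_i * integral   # PI update to the actuator

print(f"residual phase error: {drift + correction:.4f} rad")
```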

A visibility of interference fringes measuring $92.8 \pm 2.6\%$ was experimentally obtained using the Franson interferometer and single-photon detectors. This high-visibility interference directly confirms the presence of strong time-bin entanglement between the photon pairs generated. Visibility, in this context, quantifies the contrast between constructive and destructive interference; values approaching 1.0 indicate a strong correlation between the entangled photons, and thus a high degree of entanglement fidelity. The reported value demonstrates the efficacy of the experimental setup and the quality of the generated entangled states for use in quantum information applications.
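
Visibility is computed from the fringe extrema as $V = (C_{\max} - C_{\min})/(C_{\max} + C_{\min})$. The sketch below uses hypothetical coincidence counts, chosen so that $V \approx 0.928$, with simple shot-noise error propagation; the paper's actual analysis may differ:

```python
import numpy as np

def fringe_visibility(c_max, c_min):
    """Fringe visibility V = (C_max - C_min)/(C_max + C_min) with
    Poissonian (shot-noise) error propagation, sigma_C = sqrt(C)."""
    v = (c_max - c_min) / (c_max + c_min)
    dv_dmax = 2 * c_min / (c_max + c_min) ** 2    # partial derivative wrt C_max
    dv_dmin = -2 * c_max / (c_max + c_min) ** 2   # partial derivative wrt C_min
    sigma = np.hypot(dv_dmax * np.sqrt(c_max), dv_dmin * np.sqrt(c_min))
    return v, sigma

# Hypothetical counts at the fringe maximum and minimum (not the paper's data):
v, dv = fringe_visibility(c_max=2680, c_min=100)
print(f"V = {v:.3f} +/- {dv:.3f}")
```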

Confirming Quantum Correlations: The Power of the CHSH Bell Inequality

The cornerstone of quantum mechanics, entanglement, received compelling validation through the observed violation of the Clauser-Horne-Shimony-Holt (CHSH) Bell inequality. This inequality, rooted in the principles of local realism – the idea that objects possess definite properties independent of measurement and that influences cannot travel faster than light – sets a limit on the correlations achievable by any theory adhering to these tenets. Experiments consistently demonstrate that entangled particles exhibit correlations exceeding this classical bound, as evidenced by a CHSH ‘S’ parameter greater than 2. This isn’t merely a statistical anomaly; the observed value definitively proves that the particles are linked in a fundamentally non-local way, challenging the intuitive notion that physical properties are predetermined before measurement. The violation effectively rules out local realistic theories as a viable explanation for the behavior of entangled systems, solidifying the counterintuitive, yet experimentally verified, predictions of quantum mechanics and opening doors to technologies leveraging this uniquely quantum phenomenon.
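
Concretely, for a sinusoidal Franson-type correlation $E(\phi_a, \phi_b) = V\cos(\phi_a + \phi_b)$, the CHSH parameter at the optimal phase settings reduces to $S = 2\sqrt{2}\,V$. A minimal sketch under that idealized pure-state assumption:

```python
import numpy as np

def chsh_s(visibility=1.0):
    """CHSH parameter for a two-photon correlation of the form
    E(phi_a, phi_b) = V * cos(phi_a + phi_b), evaluated at phase
    settings that are optimal for this correlation function."""
    E = lambda pa, pb: visibility * np.cos(pa + pb)
    a, a2, b, b2 = 0.0, np.pi / 2, -np.pi / 4, np.pi / 4
    return abs(E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2))

print(f"S(V = 1)     = {chsh_s(1.0):.3f}")    # 2.828, the Tsirelson bound
print(f"S(V = 0.928) = {chsh_s(0.928):.3f}")  # ~2.62, well above the classical 2
```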

The experimental findings are not merely a demonstration of quantum strangeness, but a powerful confirmation of the underlying theoretical framework. A precise pure-state model, based on the principles of quantum mechanics, accurately predicted the observed correlations in the entangled photon pairs. This predictive success extends beyond simply showing that entanglement exists; it verifies the specific quantum state prepared and measured, validating the mathematical description of the system. The close agreement between the model’s predictions and the experimental data – encompassing the observed violation of the CHSH Bell inequality and the characteristics of the emitted light – establishes a high degree of confidence in the theoretical understanding of these entangled states, and reinforces the potential for their reliable utilization in emerging quantum technologies. The model’s ability to reproduce the results, down to quantifiable parameters, demonstrates that the observed phenomena are not due to hidden variables or local realistic explanations, but are genuinely rooted in the principles of quantum mechanics.

Measurements of the second-order correlation function, $g^{(2)}(\tau)$, provide compelling evidence for the non-classical nature of the emitted light. A value of $g^{(2)}(0)$ below one indicates photon antibunching, a phenomenon fundamentally incompatible with classical light sources, for which $g^{(2)}(0) \geq 1$. This characteristic signifies that photons are emitted one at a time, reinforcing the quantum mechanical description of the light source and confirming the genuine single-photon character crucial for applications in quantum technologies. Analyzing $g^{(2)}(\tau)$ not only validates the non-classicality but also allows for a detailed characterization of the quantum state, providing insights into the coherence and purity of the entanglement, and ultimately ensuring the reliability of quantum information processing.

The confirmed existence of robustly entangled states, demonstrated through violation of the CHSH Bell inequality, is not merely a fundamental confirmation of quantum mechanics, but a crucial stepping stone towards practical quantum technologies. These highly correlated quantum states are essential resources for protocols like quantum key distribution, offering provably secure communication channels, and for building quantum computers capable of solving problems intractable for classical machines. The increased brightness and efficiency of these entangled sources, exceeding previous implementations, directly addresses a key challenge in scaling up quantum systems – maintaining a sufficient signal for reliable operation. This advancement enables the exploration of more complex quantum algorithms and the development of practical quantum networks, potentially revolutionizing fields ranging from cryptography and materials science to drug discovery and artificial intelligence.

Recent experiments have definitively demonstrated a violation of the Clauser-Horne-Shimony-Holt (CHSH) Bell inequality, yielding a value of $S = 2.675(50)$ which significantly surpasses the classical limit of 2. This result isn’t merely a statistical fluctuation; the observed violation exceeds the classical bound by over 13 standard deviations, providing compelling evidence for the existence of quantum entanglement and ruling out any explanation based on local realism. Importantly, this demonstration also achieved a substantial increase in the brightness of the entangled photon source, exceeding the performance of previous radio-frequency-based implementations by more than two orders of magnitude, paving the way for more practical applications of this fundamental quantum phenomenon in fields like secure communication and advanced computation.
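
The quoted significance follows directly from the reported numbers:

```python
# Distance of the measured CHSH value above the classical bound of 2,
# in units of the quoted standard error:
s_measured, s_error = 2.675, 0.050
print((s_measured - 2.0) / s_error)   # 13.5 standard deviations
```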

The pursuit of demonstrable entanglement, as evidenced in this work with vacuum-one-photon superposition states, reveals a fundamental truth about interconnectedness. It’s a beautifully fragile construction, demanding precision at every level. As Max Planck observed, “A new scientific truth does not triumph by convincing its opponents and proclaiming that they are irrational. But rather it will be found by a new generation that grew up with it.” This resonates with the meticulous process of verifying the CHSH Bell inequality; the system doesn’t simply appear to violate classical constraints, it must be consistently demonstrated across numerous measurements. If the system looks clever, it’s probably fragile, and in quantum mechanics, the elegance of the design often hinges on the careful isolation of what must be sacrificed to achieve a coherent state.

What Lies Ahead?

The demonstration of Bell inequality violation with these vacuum-one-photon superposition states, while elegant, merely clarifies the landscape of what remains unknown. The source itself, bright as it is, presents the familiar bottlenecks of any physical system – collection efficiency, spectral purity, and the ever-present demand for greater integration. Scalability isn’t achieved through increasingly complex apparatus, but through simplification – a relentless pruning of unnecessary components. The true metric isn’t photon count rate, but the fidelity of the entangled state across a complex network.

The ecosystem of quantum communication demands more than just bright sources. It requires robust detectors, efficient memories, and error correction protocols that don’t obliterate the fragile entanglement they seek to preserve. The current focus on individual components risks obscuring the holistic behavior of the system. A truly scalable architecture will likely emerge not from incremental improvements to existing technologies, but from a fundamental rethinking of how information is encoded and transmitted.

The persistent challenge remains: how to bridge the gap between the pristine conditions of a laboratory and the noisy reality of a global network. Entanglement, after all, is a delicate structure, easily disrupted by the complexities of the world. The path forward isn’t necessarily about creating stronger entanglement, but about designing systems resilient enough to tolerate imperfection. The clarity of the underlying physics suggests this is achievable, if one remembers that structure always dictates behavior.


Original article: https://arxiv.org/pdf/2511.15413.pdf

Contact the author: https://www.linkedin.com/in/avetisyan/
