Author: Denis Avetisyan
The CMS experiment continues to push the boundaries of particle physics, meticulously analyzing data for evidence of phenomena beyond our current understanding.

A comprehensive review of recent searches for new physics in final states containing leptons, conducted by the CMS collaboration at the Large Hadron Collider.
Despite the remarkable success of the Standard Model, fundamental questions regarding the nature of mass, dark matter, and neutrino properties remain unanswered, motivating searches for physics beyond its predictions. This paper, ‘Searches in CMS for New Physics in Final States with Leptons’, presents recent results from the CMS experiment at the Large Hadron Collider, focusing on signatures involving leptons (electrons, muons, and taus) as probes for new phenomena. While no statistically significant evidence for beyond-the-Standard-Model processes has been observed, these searches have established stringent limits on a variety of theoretical models, including those positing new resonances or exotic decays. What further refinements to search strategies and data analysis techniques will be necessary to unlock the secrets of the universe beyond our current understanding?
The Illusion of Completeness
Despite its extraordinary predictive power and decades of experimental validation, the Standard Model of particle physics remains incomplete. It fails to incorporate gravity, offers no explanation for the observed abundance of dark matter and dark energy, and provides no insight into the matter-antimatter asymmetry of the universe. Furthermore, it requires the somewhat arbitrary assignment of numerous parameters, hinting at a deeper, more fundamental theory awaiting discovery. Consequently, physicists are actively pursuing “new physics” beyond the Standard Model – exploring hypothetical particles and interactions that could resolve these outstanding puzzles and provide a more complete understanding of the cosmos. These investigations range from searching for supersymmetric partners to exotic new forces and extra spatial dimensions, all driven by the belief that the current model is merely an effective theory – a successful approximation of a more comprehensive reality.
The pursuit of physics beyond the Standard Model frequently centers on meticulously examining experimental results for discrepancies from established predictions. Researchers don’t simply seek confirmation of existing theory; instead, they actively probe for anomalies – subtle deviations in particle behavior or unexpected energy signatures – that could hint at previously unknown forces or particles. This investigative approach extends to searching for particles that the Standard Model doesn’t accommodate, such as sterile neutrinos or components of dark matter. High-energy colliders, like the Large Hadron Collider, and sensitive detectors buried deep underground are employed to recreate conditions similar to the early universe and observe fleeting interactions, hoping to capture evidence of these elusive phenomena and thereby expand the known boundaries of fundamental physics.
Echoes of Creation: The Large Hadron Collider
The Large Hadron Collider (LHC) at CERN operates by accelerating two counter-rotating beams of protons to energies of 6.5 TeV per proton, and then colliding them head-on. These proton-proton collisions, occurring at a center-of-mass energy of 13 TeV, recreate conditions similar to those shortly after the Big Bang. The resulting high energies allow for the production of massive particles not observed under ordinary conditions, as described by E=mc². The frequency of these collisions is substantial: the LHC delivered an integrated luminosity of 138 fb⁻¹ during Run 2, providing a dataset large enough for the observation of rare processes and the statistical scrutiny of new particle candidates.
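The relationship between integrated luminosity and event yield is simple arithmetic: the expected number of events is the production cross-section times the integrated luminosity, N = σ·L. A minimal sketch with illustrative numbers (the 1 pb cross-section is a hypothetical example, not a CMS result):

```python
# Expected event yield: N = sigma * L_int (illustrative numbers, not CMS results).
FB_TO_PB = 1000.0  # 1 fb^-1 = 1000 pb^-1

def expected_events(cross_section_pb: float, lumi_fb: float) -> float:
    """Expected number of events for a process with the given cross-section
    in a dataset with the given integrated luminosity."""
    return cross_section_pb * lumi_fb * FB_TO_PB

# A hypothetical 1 pb process in the 138 fb^-1 Run 2 dataset:
print(expected_events(1.0, 138.0))  # 138000.0 events before any selection
```

This is why luminosity matters as much as energy: rarer processes (smaller σ) become statistically accessible only as L grows.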
The Compact Muon Solenoid (CMS) is a general-purpose detector at the Large Hadron Collider, constructed to identify and measure the momentum, energy, and type of particles produced in proton-proton collisions. Its design incorporates multiple layers of detectors including a silicon tracker, electromagnetic and hadronic calorimeters, and a muon system, each optimized for specific particle identification. The silicon tracker, positioned close to the interaction point, provides high-resolution measurements of charged particle trajectories. Calorimeters measure the energy deposited by particles, distinguishing between electromagnetic and hadronic interactions. Finally, the muon system, utilizing a strong magnetic field and dedicated detectors, identifies and measures the momentum of muons, which penetrate the calorimeters. This layered approach allows CMS to reconstruct the full event topology and precisely characterize the decay products of potential new particles.
The CMS experiment at the LHC employs a diverse set of search strategies to explore physics beyond the Standard Model. These searches are designed to identify specific theoretical particles and their predicted decay modes within the data collected during Run 2, which comprises an integrated luminosity of 138 fb⁻¹. Analysis techniques target a wide range of models, including supersymmetry, extra dimensions, and dark matter candidates, by examining final state particles – such as photons, leptons, jets, and missing transverse energy – and reconstructing invariant masses and other kinematic variables to identify potential signal excesses above expected background levels. The high luminosity allows for sensitive probes of rare decay processes and enhanced statistical power in the search for new phenomena.
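The invariant mass reconstruction mentioned above follows from the four-vector relation m² = (ΣE)² − |Σp|². A minimal sketch in natural units, using an illustrative dilepton event near the Z boson peak:

```python
import math

def invariant_mass(particles):
    """Invariant mass of a set of (E, px, py, pz) four-vectors in GeV
    (natural units): m^2 = (sum E)^2 - |sum p|^2."""
    E = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return math.sqrt(max(E * E - px * px - py * py - pz * pz, 0.0))

# Two back-to-back 45.6 GeV leptons (massless approximation) reconstruct to
# roughly 91.2 GeV, the Z boson mass -- a dominant Standard Model background peak.
lep1 = (45.6, 0.0, 0.0, 45.6)
lep2 = (45.6, 0.0, 0.0, -45.6)
print(invariant_mass([lep1, lep2]))
```

A new resonance would appear as an excess of events clustered at some other mass value on top of the smoothly falling or peaked background distribution.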

Discerning Whispers from Noise
Searches for compressed supersymmetric (SUSY) models present a unique challenge due to the small mass differences between supersymmetric particles and their Standard Model counterparts, resulting in low-momentum decay products. To address this, analyses employ ‘soft lepton’ strategies which focus on identifying leptons – electrons and muons – with reduced transverse momentum and energy. These leptons arise from the decays of weakly interacting particles produced in SUSY events. Traditional lepton identification techniques are often ineffective for these low-momentum particles due to detector resolution limitations and increased backgrounds. Consequently, specialized reconstruction and identification algorithms are implemented to enhance the sensitivity of these searches, allowing for the exploration of compressed SUSY parameter spaces that would otherwise be inaccessible.
TauNet is a deep neural network developed to improve the identification of tau leptons in high-energy physics analyses, particularly resonant searches. Traditional tau identification techniques struggle with the complex decay modes and frequent overlap with jet activity, leading to significant background contamination. TauNet addresses this by utilizing a fully-connected neural network architecture trained on simulated and observed tau decays. The network operates on high-granularity calorimeter information, allowing for a more precise reconstruction of the tau’s decay products and a subsequent reduction in misidentification rates. This improved identification efficiency directly translates to enhanced sensitivity in searches for physics beyond the Standard Model, where tau leptons often serve as key signatures of new particles.
The Hadron-Plus-Strips (HPS) algorithm addresses the challenge of reconstructing hadronic tau decays, which frequently occur as a key signature in ττ resonance searches. This algorithm utilizes information from both charged tracks and neutral calorimeter cells to identify tau candidates. It reconstructs the tau momentum by combining the four-vectors of charged tracks associated with the decay, and then adds “strips” of energy deposited in the calorimeter that are not directly associated with any charged track. These strips account for neutral particles, like neutral pions and kaons, produced in the tau decay, resulting in a more accurate momentum estimate and improved reconstruction efficiency, thereby increasing the sensitivity of the resonance search.
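The core operation of the HPS algorithm as described, combining charged-track four-vectors with unmatched calorimeter strips, can be sketched minimally. The kinematic values below are illustrative, not taken from any real event:

```python
# Minimal sketch of the HPS combination step (illustrative values): the visible
# tau momentum is the four-vector sum of its charged tracks plus calorimeter
# "strips" that capture the neutral component of the decay.
def four_sum(vectors):
    """Component-wise sum of (E, px, py, pz) four-vectors."""
    return tuple(sum(c) for c in zip(*vectors))

# (E, px, py, pz) in GeV -- hypothetical charged tracks plus one neutral strip
tracks = [(20.0, 18.0, 6.0, 6.0), (10.0, 9.0, 3.0, 3.0)]
strips = [(5.0, 4.5, 1.5, 1.5)]  # neutral energy not matched to any track

tau_vis = four_sum(tracks + strips)
print(tau_vis)  # (35.0, 31.5, 10.5, 10.5)
```

Omitting the strips would systematically underestimate the tau momentum whenever neutral pions carry part of the decay energy, which is exactly the bias HPS is designed to remove.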
Scalar leptoquark searches utilize Boosted Decision Trees (BDT) as a multivariate analysis technique to differentiate between signal events – those originating from leptoquark production – and background events arising from Standard Model processes. This discrimination is achieved by training the BDT on a set of kinematic variables optimized to maximize separation between the two event types. The resulting analysis has enabled the exclusion of scalar leptoquarks with masses up to 5 TeV, specifically for parameter spaces exhibiting large coupling strengths. The sensitivity of this search is directly related to the BDT’s ability to effectively reduce background contamination, allowing for the observation of potential signal excesses at higher mass ranges.
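The way a multivariate discriminant like a BDT score is used downstream can be illustrated with a toy cut scan. This is not the CMS analysis or a real BDT; the scores are invented, and s/√(s+b) stands in for the more sophisticated figures of merit used in practice:

```python
import math

# Toy illustration of using a BDT-style discriminant (invented scores, not CMS
# data): scan a cut on the per-event score and keep the value that maximizes
# s / sqrt(s + b), a common approximate figure of merit for search sensitivity.
signal_scores = [0.62, 0.71, 0.80, 0.85, 0.90, 0.93, 0.95, 0.97]
background_scores = [0.05, 0.10, 0.15, 0.20, 0.35, 0.55, 0.70, 0.88]

def significance(cut, sig, bkg):
    s = sum(1 for x in sig if x > cut)
    b = sum(1 for x in bkg if x > cut)
    return s / math.sqrt(s + b) if s + b > 0 else 0.0

best = max((c / 100 for c in range(100)),
           key=lambda c: significance(c, signal_scores, background_scores))
print(best)  # 0.55: rejects most background while keeping all signal events
```

This makes concrete the statement that sensitivity tracks background rejection: the optimal cut is the one that discards background while sacrificing as little signal as possible.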
Analysis of Higgs boson decays has established an upper limit on the branching ratio for decays to two same-sign leptons (H→SS) of less than 10⁻⁵ at a proper lifetime of approximately 1 mm (cτ ≈ 1 mm). This limit is determined through searches for displaced lepton pairs originating from the Higgs boson decay vertex. The sensitivity of this search is constrained by the detector resolution and the reconstruction efficiency of the displaced vertices, as well as the irreducible backgrounds from prompt lepton production and misidentified leptons. This stringent upper bound significantly constrains models predicting enhanced Higgs boson decays into charged leptons.
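The displaced-vertex signature follows an exponential decay-length distribution: a particle with proper lifetime cτ and boost βγ decays beyond a distance L with probability exp(−L/(βγ·cτ)). A minimal sketch using the cτ ≈ 1 mm benchmark from the text and an illustrative, assumed boost:

```python
import math

# Fraction of particles with proper lifetime c*tau (mm) decaying beyond a
# displacement L (mm), for a given boost beta*gamma:
#   P(d > L) = exp(-L / (beta_gamma * c_tau))
def frac_beyond(L_mm, ctau_mm, beta_gamma):
    return math.exp(-L_mm / (beta_gamma * ctau_mm))

# Hypothetical light particle with c*tau = 1 mm and an assumed boost of 10:
# fraction of decays displaced by more than 5 mm, where displaced-vertex
# selections become effective.
print(round(frac_beyond(5.0, 1.0, 10.0), 3))  # 0.607
```

The trade-off this exposes is central to such searches: short lifetimes push decays into the prompt-background region, while very long lifetimes push them outside the tracker acceptance entirely.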

The Limits of Perception
The pursuit of new physics at the Large Hadron Collider hinges critically on a precise understanding of the ‘standard’ processes that constitute the background ‘noise’ against which potential signals emerge. Processes like Drell-Yan, where a quark and antiquark collide to produce a lepton pair, and top-quark production, though well-established, are incredibly complex to model accurately. Subtle discrepancies between theoretical predictions and experimental observations in these background processes can easily masquerade as evidence for new particles or phenomena. Consequently, physicists dedicate substantial effort to refining the simulations of these processes, incorporating higher-order quantum corrections and carefully accounting for the intricacies of particle interactions. This meticulous work is not merely a technical detail; it is foundational to interpreting search results and ensuring that any claimed discovery is genuinely a signal of new physics, rather than an artifact of imperfect modeling. Without this precision, the search for what lies beyond the Standard Model remains a blurry and uncertain endeavor.
The pursuit of physics beyond the Standard Model often requires probing increasingly subtle signals obscured by overwhelming backgrounds. To address this challenge, researchers have pioneered the use of “Scouting Data,” a purposefully reduced-information data stream designed to trigger on potentially interesting events before full event reconstruction. By foregoing detailed measurements of all particles, Scouting Data allows for a higher trigger rate, effectively expanding the search reach to lower-mass regions previously inaccessible. This technique is particularly effective in identifying rare processes that might otherwise be missed, as it prioritizes capturing a larger volume of potentially signal-rich events. The reduced data size also facilitates faster analysis and allows for real-time processing, enabling searches to be conducted with greater efficiency and sensitivity.
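The trade-off behind scouting is back-of-the-envelope arithmetic: at a fixed output bandwidth, the affordable trigger rate scales inversely with the event record size. The numbers below are illustrative assumptions, not actual CMS trigger parameters:

```python
# Back-of-the-envelope scouting trade-off (illustrative numbers, not actual CMS
# trigger parameters): at fixed output bandwidth, shrinking the event record
# raises the affordable trigger rate proportionally.
def max_rate_hz(bandwidth_MB_s, event_size_MB):
    return bandwidth_MB_s / event_size_MB

full_event = max_rate_hz(2000.0, 1.0)   # ~1 MB fully reconstructed event
scouting = max_rate_hz(2000.0, 0.01)    # ~10 kB reduced scouting record
print(full_event, scouting)  # a ~100x smaller record buys a ~100x higher rate
```

That factor of ~100 in rate is what opens up low-mass, high-rate signatures that a standard trigger menu would have to discard.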
The impending upgrade to the High-Luminosity Large Hadron Collider promises a substantial leap in data acquisition, directly enhancing the precision and scope of searches for new physics. This increase in luminosity – a measure of collision rate – will allow physicists to probe deeper into rare processes and subtle signals currently hidden within the standard model background. Greater statistical power will enable more definitive tests of theoretical predictions, and the potential discovery of particles and interactions beyond those presently known. By accumulating a significantly larger dataset, researchers anticipate refining existing limits on hypothetical particles, such as those posited by supersymmetry or models with extra dimensions, and potentially revealing entirely new phenomena that reshape understanding of the fundamental constituents of the universe.
Recent advancements in suppressing the overwhelming QCD multijet background – achieving a remarkable 96% reduction – have enabled precision searches for rare Standard Model processes and potential new physics. This improved signal clarity has allowed researchers to establish upper limits on the production cross-section for pp → φ → ττ of 10 picobarns within the 20-60 GeV mass range, offering stringent constraints on potential resonant production of this final state. Furthermore, these analyses have placed tight bounds on the branching ratio for Higgs boson decay into two axion-like particles (ALPs), subsequently decaying into four electrons, constraining this potential exotic decay channel to between 10⁻⁵ and 10⁻⁶ depending on the mass of the ALP. These results demonstrate the power of advanced analysis techniques in probing beyond the Standard Model and highlight the increasing sensitivity of collider experiments.
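How a counting experiment turns an observed event count into a cross-section limit can be sketched with a simple Poisson upper limit. This is an illustrative textbook construction, not the CLs-based statistical treatment CMS actually uses: with n observed events and expected background b, the 95% CL upper limit on the signal yield s is the largest s with P(N ≤ n | s + b) ≥ 0.05.

```python
import math

# Simple counting-experiment 95% CL upper limit (illustrative textbook method,
# not the CMS statistical treatment): find the largest signal yield s such that
# P(N <= n_obs | mu = s + bkg) >= 0.05 under Poisson statistics.
def poisson_cdf(n, mu):
    return sum(math.exp(-mu) * mu ** k / math.factorial(k) for k in range(n + 1))

def upper_limit(n_obs, bkg, step=0.01):
    s = 0.0
    while poisson_cdf(n_obs, s + bkg) >= 0.05:
        s += step
    return s

# Zero observed events, negligible background: the classic ~3-event limit,
# since P(0 | mu) = exp(-mu) drops below 0.05 at mu = ln(20) ~ 3.0.
print(round(upper_limit(0, 0.0), 2))
```

Dividing such a limit on the event yield by the integrated luminosity and the selection efficiency converts it into an upper limit on the production cross-section, which is how bounds like those quoted above are expressed.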
Recent analyses have extended the exclusion limits for Higgsino masses to 140 GeV, a significant advancement that effectively bridges a longstanding gap in the search for these hypothetical particles. This result, achieved through meticulous examination of collision data, surpasses the limitations previously imposed by the Large Electron-Positron Collider (LEP). The increased energy and luminosity of the current collider, combined with sophisticated data analysis techniques, have allowed researchers to probe higher mass ranges, pushing the boundaries of known physics and narrowing the parameter space for supersymmetry models. This achievement not only strengthens constraints on theoretical models predicting the existence of Higgsinos, but also demonstrates the continued power of collider experiments to unravel the mysteries of the universe and search for physics beyond the Standard Model.

The pursuit of new physics, as diligently undertaken by the CMS experiment, resembles a complex simulation striving to map the unseeable. Each search for deviations from the Standard Model, meticulously examining leptonic decay channels, is an attempt to resolve the blurry edges of reality. Yet, despite the sophistication of these analyses, the findings remain consistent with established theory. As Jürgen Habermas observed, “The only way to learn is to ask questions.” This relentless questioning, even when yielding null results, serves not to invalidate the process, but to refine the parameters of inquiry, pushing the boundaries of what is known and, ultimately, revealing more about the limitations of current understanding. The absence of evidence, it seems, is a form of evidence itself.
What Lies Beyond the Horizon?
The continued absence of statistically significant deviations in lepton final states, as documented by the CMS experiment, presents a curious situation. It is not necessarily a testament to the completeness of the Standard Model, but rather a stark reminder of the limitations inherent in any attempt to map the universe through observation. The search continues, naturally, with increased luminosity and refined analysis techniques, yet the very act of seeking implies a preconceived notion of what is to be found. Modeling increasingly complex scenarios – supersymmetry, extra dimensions, exotic Higgs decays – requires consideration of not only the observed data, but also the biases embedded within the theoretical frameworks themselves.
The observed limits on various beyond-the-Standard-Model scenarios are, in a sense, more informative than any discovery could be. They define the boundaries of current understanding, the point beyond which existing theories falter. The accretion disk of theoretical possibility is narrowing, and the spectral line variations hint at complexities yet to be fully resolved. Each null result is not a failure, but a recalibration, a humbling acknowledgement that the universe may not conform to the elegance of human imagination.
Future investigations will undoubtedly explore increasingly subtle decay channels and leverage advancements in machine learning to identify faint signals. However, it is prudent to remember that even the most sophisticated analysis is predicated on assumptions about the underlying physics. The true horizon may not be defined by a lack of data, but by the limits of the questions asked.
Original article: https://arxiv.org/pdf/2603.04150.pdf
Contact the author: https://www.linkedin.com/in/avetisyan/
2026-03-05 09:15