
Search Results (170)

Search Parameters:
Keywords = LHC energies

11 pages, 5740 KB  
Review
Recent Progress in Antimatter Research with Heavy-Ion Collisions
by Tan Lu, Junlin Wu and Hao Qiu
Symmetry 2026, 18(4), 693; https://doi.org/10.3390/sym18040693 - 21 Apr 2026
Viewed by 270
Abstract
Matter–antimatter asymmetry is a fundamental question in both astronomy and particle physics. Investigating antimatter is of great interest for testing the potential explanations of matter–antimatter asymmetry in our Universe. In relativistic heavy-ion collisions, the extremely high energy density and temperature are similar to those of the early Universe shortly after the Big Bang. In this paper, we review recent progress in antimatter searches and studies in heavy-ion collisions, with a focus on the RHIC-STAR and LHC-ALICE experiments, particularly the newly observed antimatter hypernuclei, anti-hyperhydrogen-4 and anti-hyperhelium-4. The statistical thermal model and the coalescence production model can quantitatively describe the production yields and yield ratios, and the yield measurements of anti-hyperhydrogen-4, anti-hyperhelium-4 and their matter counterparts indicate the existence of spin-excited states of these (anti)hypernuclei. Furthermore, new measurements of the lifetimes of the anti-hypertriton, anti-hyperhydrogen-4 and their matter counterparts reveal no difference between a particle and its corresponding antiparticle, which validates the CPT theorem.

16 pages, 1911 KB  
Article
Development of 28 nm CMOS Front-End Channels for the Readout of Hybrid Pixel Sensors in Future Colliders and Photon Science Applications
by Luigi Gaioni, Simone Gerardin, Valerio Re and Gianluca Traversi
Electronics 2026, 15(8), 1641; https://doi.org/10.3390/electronics15081641 - 14 Apr 2026
Viewed by 373
Abstract
This paper describes two front-end architectures developed in a 28 nm CMOS process for the readout of pixel detectors in future high-energy physics (HEP) colliders and advanced X-ray imaging instrumentation. The front-end channels have been developed in the framework of the PiHEX project, funded by the Italian Ministry of University and Research. PiHEX aims to improve the state of the art of pixel readout chip technology in high-luminosity colliders and in X-ray imagers for the next generation of free electron lasers (FELs) by developing, in 28 nm CMOS technology, the fundamental microelectronic building blocks for pixel readout chips. Such blocks, which also implement innovative circuit ideas, will enable the future integration of large-scale readout chips meeting a set of challenging requirements: high spatial resolution, high signal-to-noise ratio, very wide dynamic range, and the capability to withstand unprecedented radiation levels. Two different front-end channels were designed, integrated into two prototype chips, and tested. One architecture, featuring a pixel size of 25 µm × 100 µm, was optimized for tracking applications in high-energy physics experiments, such as those at the high-luminosity upgrade of the CERN Large Hadron Collider (LHC), while the second, featuring a pixel size of 110 µm × 55 µm, was devised for X-ray imaging applications in FELs.
(This article belongs to the Special Issue New Trends in CMOS: Devices, Technologies, and Applications)

13 pages, 4616 KB  
Review
Current Status and Future Prospects of the LHCf Experiment
by Oscar Adriani, Eugenio Berti, Pietro Betti, Lorenzo Bonechi, Massimo Bongi, Raffaello D’Alessandro, Sebastiano Detti, Elena Gensini, Elena Geraci, Maurice Haguenauer, Vlera Hajdini, Cigdem Issever, Yoshitaka Itow, Katsuaki Kasahara, Haruka Kobayashi, Clara Leitgeb, Yutaka Matsubara, Hiroaki Menjo, Yasushi Muraki, Andrea Paccagnella, Paolo Papini, Giuseppe Piparo, Sergio Bruno Ricciarini, Takashi Sako, Nobuyuki Sakurai, Monica Scaringella, Yuki Shimizu, Tadashi Tamura, Alessio Tiberio, Shoji Torii, Alessia Tricomi, Bill Turner and Kenji Yoshida
Particles 2026, 9(2), 34; https://doi.org/10.3390/particles9020034 - 2 Apr 2026
Viewed by 315
Abstract
The Large Hadron Collider forward (LHCf) experiment studies the production of neutral particles in the very forward region of high-energy hadronic collisions at the LHC. These measurements provide essential calibration data for hadronic interaction models used in simulations of extensive air showers initiated by ultra-high-energy cosmic rays. The LHCf experiment measures forward-produced neutral particles, such as neutrons, photons, π⁰, and η mesons, which play a key role in the development of extensive air showers. Proton–proton collisions at the LHC reach center-of-mass energies of up to 13.6 TeV, corresponding in the fixed-target frame to cosmic-ray interactions at energies close to 10¹⁷ eV in the Earth’s atmosphere. LHCf has collected data in proton–proton collisions at several energies, as well as in proton–lead collisions, enabling detailed comparisons between experimental results and predictions of hadronic interaction models. This contribution reviews the most significant LHCf results, with emphasis on Run II proton–proton data at √s = 13 TeV, including measurements of forward neutron, photon, and η meson production. Finally, future prospects are discussed, focusing on ongoing analyses of Run III proton–proton data at √s = 13.6 TeV and on the final LHCf operation in proton–oxygen collisions at √s_NN = 9.6 TeV, which best reproduces cosmic-ray interactions with nuclei of the Earth’s atmosphere.
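The collider-to-cosmic-ray correspondence quoted in the abstract follows from equating invariant masses: for symmetric proton–proton collisions at center-of-mass energy √s, the equivalent fixed-target beam energy is E_lab ≈ s/(2·m_p). A quick numerical check (my own sketch, not LHCf code):

```python
# Fixed-target equivalent of a collider with center-of-mass energy sqrt(s):
# E_lab ~= s / (2 * m_p), valid in the ultra-relativistic limit E_lab >> m_p.
M_P = 0.938  # proton mass in GeV

def fixed_target_energy_gev(sqrt_s_gev: float) -> float:
    """Lab-frame beam energy (GeV) equivalent to a symmetric pp collision."""
    return sqrt_s_gev ** 2 / (2.0 * M_P)

e_lab_ev = fixed_target_energy_gev(13.6e3) * 1e9  # Run III: sqrt(s) = 13.6 TeV
print(f"{e_lab_ev:.2e} eV")  # close to 1e17 eV, as stated in the abstract
```

For √s = 13.6 TeV this gives roughly 9.9 × 10¹⁶ eV, consistent with the "close to 10¹⁷ eV" figure above.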

24 pages, 6500 KB  
Article
Integrated Analysis of Physiological and Transcriptional Mechanisms in Response to Drought Stress in Scaevola taccada Seedlings
by Yaqin Wang, Wenlan Liu, Cunwu Zuo, Yongzhong Luo and Mengting Huang
Plants 2026, 15(6), 970; https://doi.org/10.3390/plants15060970 - 21 Mar 2026
Viewed by 499
Abstract
Scaevola taccada, as a key dominant plant in coastal ecosystems, plays an irreplaceable role in sand fixation, shoreline protection, and maintaining the ecological stability of coastal zones. To investigate the effects of drought stress on seedlings of the coastal plant Scaevola taccada, a natural drought treatment was applied. Physiological indicators were measured at 0, 10, 25, and 40 days of stress, and 5 days after rewatering. Transcriptome sequencing and long non-coding RNA (lncRNA) analysis were also conducted to reveal the drought response mechanisms and molecular regulatory networks. The results showed the following: (1) Prolonged drought significantly inhibited growth, with relative height increase, leaf number, and relative water content declining by 46.8%, 37.2%, and 63.4%, respectively, at T40 compared to the control. (2) In terms of photosynthetic physiology, Rubisco activity, RCA activity, SPAD value, Fv/Fm, and qP all continuously declined with increasing stress, while NPQ increased, suggesting damage to the photosynthetic system but also the activation of energy dissipation mechanisms to alleviate photooxidative stress. (3) The antioxidant system played a crucial role in the drought response. Under drought stress, the activities of SOD, POD, and CAT, and the MDA content, underwent significant changes, with antioxidant enzyme activities rebounding notably after rewatering. (4) Transcriptome analysis revealed that differentially expressed mRNAs and lncRNA-targeted genes were significantly enriched in the ‘photosynthesis’ and ‘carbon metabolism’ pathways. Key genes involved, including PSAD-1, PSAL, NPQ4, six LHCs, BAM3, BAM1, SSII-A, and FRK1, were identified as core components of the regulatory network.
In summary, Scaevola taccada effectively responds to drought stress through multi-level mechanisms, including photosynthetic regulation, carbon metabolism regulation, antioxidant defense, and transcriptional reprogramming, demonstrating strong drought resistance and post-rewatering recovery potential. These findings provide scientific evidence for plant selection and application in ecological restoration projects in coastal areas in the context of global climate extremes.
(This article belongs to the Section Plant Physiology and Metabolism)

53 pages, 1976 KB  
Review
Fully Heavy Pentaquarks with Jethad: A High-Energy Viewpoint
by Francesco Giovanni Celiberto
Particles 2026, 9(1), 23; https://doi.org/10.3390/particles9010023 - 3 Mar 2026
Viewed by 684
Abstract
We examine the leading-power fragmentation of fully heavy pentaquarks in high-energy hadronic collisions. To this end, we complete the release of the hadron-structure-oriented PQ5Q1.0 fragmentation functions by discussing the P_5c set and delivering the P_5b one. These functions incorporate an improved computation of the initial-scale input for the constituent heavy-quark fragmentation channel, making them particularly suitable for describing both the direct formation of a compact multicharm state and the hadronization from a diquark–antiquark–diquark configuration. For phenomenological applications, we employ the data-validated (sym)Jethad framework to compute and analyze NLL/NLO+ semi-inclusive production rates of pentaquark-plus-jet systems at the upcoming HL-LHC and the future FCC. This study marks a further step toward connecting hadronic structure, precision QCD, and the emerging physics of exotic matter.

29 pages, 16526 KB  
Article
Enhanced Optimization-Based PV Hosting Capacity Method for Improved Planning of Real Distribution Networks
by Jairo Blanco-Solano, Diego José Chacón Molina and Diana Liseth Chaustre Cárdenas
Electricity 2026, 7(1), 12; https://doi.org/10.3390/electricity7010012 - 2 Feb 2026
Viewed by 631
Abstract
This paper presents an optimization-based method to support distribution system operators (DSOs) in planning large-scale photovoltaic (PV) integration at the medium-voltage (MV) level. The PV hosting capacity (PV-HC) problem is formulated as a mixed-integer quadratically constrained program (MIQCP) without linearizing approximations to determine PV sizes and locations while enforcing operating limits and planning constraints, including candidate PV locations, per-unit PV capacity limits, active power exchange with the upstream grid, and PV power factor. Our method defines two HC solution classes: (i) sparse solutions, which allocate the PV capacity to a limited subset of candidate nodes, and (ii) non-sparse solutions, which are derived from locational hosting capacity (LHC) computations at all candidate nodes and are then aggregated into conservative zonal HC values. The approach is implemented in a Hosting Capacity–Distribution Planning Tool (HC-DPT) composed of a Python–AMPL optimization environment and a Python–OpenDSS probabilistic evaluation environment. The worst-case operating conditions are obtained from probabilistic models of demand and solar irradiance, and Monte Carlo simulations quantify the performance under uncertainty over a representative daily window. To support integrated assessment, the index G_exp is introduced to jointly evaluate exported energy and changes in local distribution losses, enabling a system-level interpretation beyond loss variations alone. A strategy was also proposed to derive worst-case scenarios from zonal HC solutions to bound performance metrics across multiple PV integration schemes. Results from a real MV case study show that PV location policies, export constraints, and zonal HC definitions drive differences in losses, exported energy, and solution quality while maintaining computation times compatible with DSO planning workflows.

51 pages, 20151 KB  
Review
Tetraquark-Jet Systems at the High-Luminosity LHC
by Francesco Giovanni Celiberto
Universe 2026, 12(1), 13; https://doi.org/10.3390/universe12010013 - 2 Jan 2026
Cited by 1 | Viewed by 839
Abstract
We investigate the high-energy production of tetraquark-jet systems at the LHC and its forthcoming high-luminosity upgrade. In this review, we examine the leading-power fragmentation of fully heavy tetraquarks (T_4Q) in hadronic collisions, highlighting their relevance as novel probes of multiquark dynamics in QCD. Our analysis relies on the hadron-structure-oriented TQ4Q1.1 fragmentation functions, built within a nonrelativistic QCD framework that incorporates both gluon- and heavy-quark-initiated channels. Threshold-consistent DGLAP evolution is performed through the HF-NRevo scheme, enabling a unified treatment of mass thresholds and scale variations. We also provide a systematic discussion of uncertainties arising from color-composite long-distance matrix elements (LDMEs) and from perturbative hard- and fragmentation-scale inputs (H- and F-MHOUs). Phenomenological predictions are obtained using the (sym)Jethad framework at NLL/NLO+ accuracy for semi-inclusive tetraquark-jet production at the LHC and beyond. This review connects the emerging spectroscopy of fully heavy exotics with modern fragmentation-based approaches to hadron structure and high-energy QCD.
(This article belongs to the Section High Energy Nuclear and Particle Physics)

32 pages, 13923 KB  
Article
Design of a Hermetic Centrifugal Pump Impeller Using RSM and Evolutionary Algorithms with Application of SLS Technology
by Viorel Bostan, Andrei Petco, Dmitrii Croitor, Nadejda Proca and Vadim Zubac
Processes 2026, 14(1), 152; https://doi.org/10.3390/pr14010152 - 1 Jan 2026
Cited by 1 | Viewed by 939
Abstract
This study presents the development and validation of a comprehensive numerical optimisation methodology used to improve the energy efficiency of a pump with nominal characteristics: volume flow rate Q_nom = 6.3 m³/h and head H = 20 m H₂O. The methodology was implemented in ANSYS Workbench using ANSYS CFX and optiSLang. The optimisation process is based on data from 853 RANS (SST) calculations on a sample generated by the Latin hypercube (LHC) method, varying the parameters of the blades and flow path. Response surfaces (RSM) were constructed using anisotropic and classical kriging and were optimised using an Evolutionary Algorithm (EA). The optimised geometry was verified numerically by URANS SST and experimentally. For physical validation, the impeller was manufactured using SLS technology from PA-12 Industrial powder, a fluid–structure interaction (FSI) strength assessment was performed, and the geometry was checked by 3D scanning, which showed high manufacturing accuracy (deviations of 0.1–0.3 mm). The result is a geometry that increases efficiency while maintaining head, as confirmed by experimental validation.
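The sampling step above relies on the Latin hypercube property: each design parameter's range is divided into as many equal strata as there are samples, and every stratum is used exactly once. The study itself generated its sample with optiSLang; the standalone sketch below only illustrates the principle, and the parameter names and bounds are invented for illustration:

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Latin hypercube sample: n_samples points over the given (lo, hi) bounds.
    Each parameter range is split into n_samples equal strata; every stratum
    contributes exactly one point, and stratum order is shuffled per dimension."""
    rng = random.Random(seed)
    dims = []
    for lo, hi in bounds:
        pts = [lo + (hi - lo) * (i + rng.random()) / n_samples
               for i in range(n_samples)]  # one random point per stratum
        rng.shuffle(pts)
        dims.append(pts)
    return list(zip(*dims))  # n_samples tuples, one value per parameter

# Illustrative blade parameters (not the paper's actual design variables):
samples = latin_hypercube(8, [(15.0, 35.0),   # blade inlet angle, degrees
                              (20.0, 45.0)])  # blade outlet angle, degrees
```

Compared with plain random sampling, this stratification spreads a small budget of expensive CFD runs evenly across the design space, which is why it is a common choice for building response surfaces.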
(This article belongs to the Section AI-Enabled Process Engineering)

49 pages, 1451 KB  
Review
Triply Heavy Ω Baryons with Jethad: A High-Energy Viewpoint
by Francesco Giovanni Celiberto
Symmetry 2026, 18(1), 29; https://doi.org/10.3390/sym18010029 - 23 Dec 2025
Cited by 2 | Viewed by 646
Abstract
We investigate the leading-power fragmentation of triply heavy Ω baryons in high-energy hadronic collisions. Extending our previous work on the Ω_3c sector, we release the full OMG3Q1.0 family of collinear fragmentation functions by completing the description of the charm channel and delivering novel Ω_3b functions. These hadron-structure-oriented functions are constructed from improved proxy-model calculations for heavy-quark and gluon fragmentation, matched to a flavor-aware DGLAP evolution based on the HF-NRevo scheme. For phenomenological applications, we employ the (sym)Jethad multimodular interface to compute and analyze NLL/NLO+ semi-inclusive Ω_3Q plus jet distributions at the HL-LHC and FCC. This work consolidates the link between hadron structure, rare baryon production, and resummed QCD at the energy frontier.
(This article belongs to the Special Issue Symmetry and Quantum Chromodynamics)

69 pages, 84358 KB  
Review
Advances and Prospects of Lignin-Derived Hard Carbons for Next-Generation Sodium-Ion Batteries
by Narasimharao Kitchamsetti and Sungwook Mhin
Polymers 2025, 17(20), 2801; https://doi.org/10.3390/polym17202801 - 20 Oct 2025
Cited by 12 | Viewed by 3183
Abstract
Lignin-derived hard carbon (LHC) has emerged as a highly promising anode material for sodium-ion batteries (SIBs), owing to its renewable nature, structural tunability, and notable electrochemical properties. Although considerable advancements have been made in the development of LHCs in recent years, the absence of a comprehensive and critical review continues to impede further innovation in the field. To address this deficiency, the present review begins by examining the intrinsic characteristics of lignin and hard carbon (HC) to elucidate the underlying mechanisms of LHC microstructure formation. It then systematically categorizes the synthesis strategies, structural attributes, and performance influences of various LHCs, focusing particularly on how feedstock characteristics and fabrication parameters dictate final material behavior. Furthermore, optimization methodologies such as feedstock pretreatment, controlled processing, and post-synthesis modifications are explored in detail to provide a practical framework for performance enhancement. Finally, informed recommendations and future research directions are proposed to facilitate the integration of LHCs into next-generation SIB systems. This review aspires to deepen scientific understanding and guide rational design for improved LHC applications in energy storage.
(This article belongs to the Special Issue Advances in Polymer Applied in Batteries and Capacitors, 2nd Edition)

13 pages, 2995 KB  
Article
Gluon Condensation as a Unifying Mechanism for Special Spectra of Cosmic Gamma Rays and Low-Momentum Pion Enhancement at the Large Hadron Collider
by Wei Zhu, Jianhong Ruan, Xurong Chen and Yuchen Tang
Symmetry 2025, 17(10), 1664; https://doi.org/10.3390/sym17101664 - 6 Oct 2025
Viewed by 806
Abstract
Gluons within the proton may accumulate near a critical momentum due to nonlinear QCD effects, leading to gluon condensation. Surprisingly, the pion distribution predicted by this gluon distribution could answer two puzzles in astronomy and high-energy physics. During ultra-high-energy cosmic ray collisions, gluon condensation may abruptly produce a large number of low-momentum pions, whose electromagnetic decays show the typical broken power law. On the other hand, the Large Hadron Collider (LHC) shows weak but recognizable signs of gluon condensation, which had been mistaken for BEC pions. Symmetry is one of the fundamental principles behind natural phenomena; conservation of energy stems from time symmetry, one of the most central principles in nature. In this study, we reveal that the connection between these two apparently unrelated phenomena can be explained by the fundamental principle of conservation of energy, highlighting the deep connection and unifying role symmetry plays in physical processes.
(This article belongs to the Section Physics)

11 pages, 1464 KB  
Article
Effects of Polymerization Initiators on Plastic Scintillator Light Output
by Mustafa Kandemir and Bora Akgün
Instruments 2025, 9(3), 19; https://doi.org/10.3390/instruments9030019 - 22 Aug 2025
Cited by 1 | Viewed by 1709
Abstract
Polymerization initiators are commonly used to lower the processing temperatures and accelerate the synthesis of plastic scintillators. However, these additives can reduce light output. Since plastic scintillator tiles, fibers, and bars are used in countless radiation detection instruments, from PET scanners to LHC calorimeters, any loss in light output immediately degrades the timing and energy resolution of the whole system. Understanding how the initiators alter scintillation performance is therefore important. In this study, five different plastic scintillator samples were produced with varying concentrations of two initiators, 2,2′-azobis(2-methylpropionitrile) (AIBN) and benzoyl peroxide (BPO), along with a reference sample containing no initiators. The relative light yield (RLY) was measured using four different gamma sources. Analyzing the Compton edges revealed that higher initiator concentrations consistently decrease the light output. This study shows that keeping the initiator concentration at 0.2% limits the reduction to 8%, whereas 0.5–1% loadings can lower the yield by 20–35%, providing realistic bounds on initiator levels for future plastic scintillator productions.

18 pages, 1981 KB  
Article
Enrichment of the HEPscore Benchmark by Energy Consumption Assessment
by Taras V. Panchenko and Nikita D. Piatygorskiy
Technologies 2025, 13(8), 362; https://doi.org/10.3390/technologies13080362 - 15 Aug 2025
Cited by 1 | Viewed by 1049
Abstract
The HEPscore benchmark, widely used for evaluating computational performance in high-energy physics, has been identified as requiring energy consumption metrics to address the increasing importance of energy efficiency in large-scale computing infrastructures. This study introduces an energy measurement extension for HEPscore, designed to operate across diverse hardware platforms without requiring administrative privileges or physical modifications. The extension utilizes the Running Average Power Limit (RAPL) interface available in modern processors and dynamically selects the most suitable measurement method based on system capabilities. When RAPL access is unavailable, the system automatically switches to alternative measurement approaches. To validate the accuracy of the software-based measurements, external hardware monitoring devices were used to collect reference data directly from the power supply circuit. The obtained results demonstrate a significant correlation across multiple test platforms running standard HEP workloads. The developed extension integrates energy consumption data into standard HEPscore reports, enabling the calculation of energy efficiency metrics such as HEPscore/Watt. This implementation meets the requirements of the HEPiX Benchmarking Working Group, providing a reliable and portable solution for quantifying energy efficiency alongside computational performance. The proposed method supports informed decision making in resource planning and hardware acquisition for HEP computing environments.
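On Linux, RAPL counters are typically exposed through the powercap sysfs interface (e.g. /sys/class/powercap/intel-rapl:0/energy_uj) as a microjoule counter that wraps around at max_energy_range_uj. The sketch below illustrates the two core calculations such an extension needs, a wraparound-safe energy delta and a score-per-watt metric; it is my own illustration, not the actual HEPscore extension code:

```python
def energy_delta_uj(start_uj: int, end_uj: int, max_range_uj: int) -> int:
    """Energy consumed between two RAPL counter reads (microjoules),
    handling a single counter wraparound during the measurement window."""
    if end_uj >= start_uj:
        return end_uj - start_uj
    return (max_range_uj - start_uj) + end_uj  # counter wrapped once

def score_per_watt(score: float, energy_j: float, wall_time_s: float) -> float:
    """HEPscore/Watt-style efficiency: benchmark score divided by the
    average power draw (energy over wall-clock time) of the run."""
    return score / (energy_j / wall_time_s)

# Example with invented numbers: counter wrapped during a 600 s run.
joules = energy_delta_uj(start_uj=260_000_000_000, end_uj=2_000_000_000,
                         max_range_uj=262_143_328_850) / 1e6
efficiency = score_per_watt(score=1500.0, energy_j=joules, wall_time_s=600.0)
```

In practice the counters should be read per package (and per DRAM domain where available) and sampled often enough that at most one wraparound can occur between reads.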
(This article belongs to the Section Information and Communication Technologies)

26 pages, 4856 KB  
Article
PREFACE: A Search for Long-Lived Particles at the Large Hadron Collider
by Burak Hacisahinoglu, Suat Ozkorucuklu, Maksym Ovchynnikov, Michael G. Albrow, Aldo Penzo and Orhan Aydilek
Physics 2025, 7(3), 33; https://doi.org/10.3390/physics7030033 - 1 Aug 2025
Viewed by 2101
Abstract
The Standard Model (SM) fails to explain many problems (neutrino masses, dark matter, and matter–antimatter asymmetry, among others) that may be resolved with new particles beyond the SM. The non-observation of such new particles may be explained either by their exceptionally high mass or by their considerably small coupling to SM particles. The latter case implies relatively long lifetimes. Such long-lived particles (LLPs) tend to have signatures different from those of SM particles. Searches in the “central region” are covered by the LHC general-purpose experiments, while the forward small-angle region far from the interaction point (IP) is unexplored. Forward-produced particles are expected to have energies as large as E = O(1 TeV) and Lorentz time-dilation factors γ = E/m ≈ 10²–10³ (with m the particle mass), and hence long decay distances. A new class of specialized LHC detectors dedicated to LLP searches has been proposed for the forward regions. Among these experiments, FASER is already operational, and FACET is under consideration at a location 100 m from LHC interaction point 5 (IP5, the location of the CMS detector). However, some features of FACET require a specially enlarged beam pipe, which cannot be implemented for LHC Run 4. In this study, we explore PREFACE, a simplified version of the proposed detector compatible with the standard LHC beam pipe in the HL-LHC Run 4. Realistic Geant4 simulations are performed and the background is evaluated. An initial analysis of the physics potential with the PREFACE geometry indicates that several significant channels could be accessible, with sensitivities comparable to FACET and other LLP searches.
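The link between the quoted Lorentz factor and "long decay distances" is the mean lab-frame decay length λ = γβcτ. A back-of-the-envelope sketch (the particle mass and lifetime below are illustrative choices, not values from the PREFACE analysis):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def decay_length_m(energy_gev: float, mass_gev: float, tau_s: float) -> float:
    """Mean lab-frame decay length lambda = gamma * beta * c * tau."""
    gamma = energy_gev / mass_gev            # time-dilation factor E/m
    beta = math.sqrt(1.0 - 1.0 / gamma**2)   # ~1 for gamma >> 1
    return gamma * beta * C * tau_s

# A hypothetical 1 GeV LLP carrying E = 1 TeV with proper lifetime 1 ns:
lam = decay_length_m(1000.0, 1.0, 1e-9)  # gamma = 1000 -> roughly 300 m
```

With γ ≈ 10³, even nanosecond proper lifetimes translate into decay lengths of hundreds of meters, which is why forward detectors placed ~100 m from the interaction point can probe this parameter space.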
(This article belongs to the Section High Energy Physics)

26 pages, 491 KB  
Article
Remarkable Scale Relation, Approximate SU(5), Fluctuating Lattice
by Holger B. Nielsen
Universe 2025, 11(7), 211; https://doi.org/10.3390/universe11070211 - 26 Jun 2025
Cited by 2 | Viewed by 800
Abstract
In this study, we discuss a series of eight energy scales, some of which are our own speculations, and fit the logarithms of these energies as a straight line versus a quantity related to the dimensionalities of action terms, in a way defined in the article. These terms in the action are related to the energy scales in question; for example, the dimensionality of the Einstein–Hilbert action coefficient is the one related to the Planck scale. In fact, we suppose that, in the cases described with quantum field theory, there is, for each of our energy scales, a pair of associated terms in the Lagrangian density, one “kinetic” and one “mass or current” term. To plot the energy scales, we use the ratio of the dimensionality of, say, the “non-kinetic” term to the dimensionality of the “kinetic” one. For an explanation of our phenomenological finding that the logarithm of the energies depends, as a straight line, on the dimensionality-defined integer q, we give an ontological “fluctuating lattice” (i.e., one that really exists in nature in our model) with a very broad distribution of, say, the link size a; we take the distribution to be Gaussian in the logarithm ln(a). A fluctuating lattice is very natural in a theory with general relativity, since it corresponds to fluctuations in the gauge depth of the field of general relativity. The lowest of our energy scales are intriguing, as they are not described by quantum field theory like the others but by actions for a single particle or single string, respectively. The string scale fits well with hadronic strings, and the particle scale is presumably the mass scale of Standard Model group monopoles, the bound state of a couple of which might be the dimuon resonance (or statistical fluctuation) found at the LHC with a mass of 28 GeV.
(This article belongs to the Section High Energy Nuclear and Particle Physics)
