Search Results (689)

Search Parameters:
Keywords = large n problem

13 pages, 787 KB  
Article
An Exploratory Randomised Trial of a Self-Managed Home-Based Exaggerated Spatial Cueing Intervention for Handwriting in Parkinson’s Disease
by Daria Andreoli, Alex Reed, Shelly Coe, Helen Dawes and Johnny Collett
Disabilities 2025, 5(4), 93; https://doi.org/10.3390/disabilities5040093 - 21 Oct 2025
Viewed by 227
Abstract
Handwriting impairment is a cardinal symptom of Parkinson’s. However, treatment options are limited. Here we evaluate the utility and estimate effects of a novel low-resource handwriting intervention (ClinicalTrials.gov NCT03369587). Forty-eight people with Parkinson’s with self-reported handwriting problems were recruited to an exploratory, assessor-blind, two-arm parallel randomized trial and allocated to either diverging (n = 24, n = 19 analysed) or parallel (n = 24, n = 20 analysed) groups. Both received a six-week, five-times-a-week handwriting program: writing a daily diary on lined paper (diverging: 10 mm increasing to 13 mm apart; parallel: 10 mm apart). Outcomes were measures of impairment (cursive ‘el’, single and dual-task), handwriting function (sentence and free writing) and self-reported difficulties. Median diary entries (31, IQR: 17.5–39) were greater than requested (30), with no difference between groups, p = 0.302. No adverse events were reported. Regardless of group, improvements were found in writing ‘el’ speed (single task: d = −0.90, 95% CI: −1.41 to −0.38, p = 0.001; dual task: d = −0.72, 95% CI: −1.24 to −0.21, p = 0.09) and amplitude (single task: d = 1.07, 95% CI: 0.49 to 1.66, p < 0.001; dual task: d = 0.86, 95% CI: 0.35 to 1.37, p = 0.002). Sentence amplitude (d = 0.80, 95% CI: 0.30 to 1.29, p = 0.003) and perceived difficulties also improved (OR = −3.6, 95% CI: −12.6 to −1.0, p = 0.047). Between-group effects were small (d = 0.11 to 0.48). Large improvements to handwriting, which required less attention, were found after self-directed, well-adhered-to practice. Potential additional benefits of exaggerated cueing were small. Full article
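The d values above are standardized mean differences (Cohen’s d). As a reminder of how that effect size is computed, here is a minimal sketch using a pooled standard deviation; the data are invented for illustration and are not from the trial.

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d: standardized mean difference using a pooled SD."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Hypothetical pre/post handwriting amplitudes in mm (invented numbers).
pre  = [4.1, 3.8, 4.5, 3.9, 4.2, 4.0]
post = [5.0, 4.7, 5.4, 4.9, 5.1, 4.8]
print(round(cohens_d(post, pre), 2))
```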

20 pages, 363 KB  
Article
A Set of Master Variables for the Two-Star Random Graph
by Pawat Akara-pipattana and Oleg Evnin
Entropy 2025, 27(10), 1081; https://doi.org/10.3390/e27101081 - 19 Oct 2025
Viewed by 172
Abstract
The two-star random graph is the simplest exponential random graph model with nontrivial interactions between the graph edges. We propose a set of auxiliary variables that control the thermodynamic limit where the number of vertices N tends to infinity. Such ‘master variables’ are usually highly desirable in treatments of ‘large N’ statistical field theory problems. For the dense regime, when a finite fraction of all possible edges are filled, this construction recovers the mean-field solution of Park and Newman, but with explicit control over the 1/N corrections. We use this advantage to compute the first subleading correction to the Park–Newman result, which encodes the finite, nonextensive contribution to the free energy. For the sparse regime with a finite mean degree, we obtain a very compact derivation of the Annibale–Courtney solution, originally developed with the use of functional integrals, which is comfortably bypassed in our treatment. Full article
(This article belongs to the Section Statistical Physics)
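For readers unfamiliar with the model, the two-star exponential random graph is conventionally written as follows (a standard textbook form; the paper’s normalization conventions may differ):

```latex
P(G) \;=\; \frac{1}{Z(\theta_1,\theta_2)}\,
  \exp\!\big(\theta_1 L(G) + \theta_2 S(G)\big),
\qquad
L(G) = \sum_{i<j} A_{ij},
\qquad
S(G) = \sum_{i=1}^{N} \binom{k_i}{2},
```

where A is the adjacency matrix and k_i the degree of vertex i; L counts edges, S counts two-stars, and the edge–edge interaction enters through S.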

37 pages, 27740 KB  
Article
A Dynamic Multi-Objective Optimization Algorithm for AGV Routing in Assembly Workshops
by Yong Chen, Yuqi Sun, Mingyu Chen, Wenchao Yi, Zhi Pei and Jiong Li
Appl. Sci. 2025, 15(20), 11076; https://doi.org/10.3390/app152011076 - 16 Oct 2025
Viewed by 407
Abstract
This study tackles the complex challenge of dynamic multi-objective vehicle routing optimization in large-scale equipment manufacturing, where routing operations significantly impact both economic performance and environmental sustainability. We develop a Dynamic Multi-Objective Vehicle Routing Problem (DMOVRP) model that integrates three competing objectives: environmental impact reduction, delivery timeliness, and operational robustness. The proposed approach couples NSACOWDRL, an adaptive multi-objective optimization algorithm, with a dynamic event handler that manages real-time disruptions through specialized event classification and dynamic rescheduling protocols. Extensive computational experiments demonstrate the algorithm’s superior performance, with statistically significant improvements under the Wilcoxon signed-rank test (p < 0.05, n = 30 runs per instance), achieving average relative gains of 15.2% in HV, 12.8% in IGD, and 8.9% in GD metrics compared to established methods. This research makes theoretical contributions through its feasibility quantification metric and practical advancements in routing schedule systems. By reconciling traditionally conflicting objectives through dynamic JIT adjustments and robustness-aware optimization, this work provides manufacturers with a versatile decision-support tool that adapts to unpredictable workshop conditions while maintaining sustainable operations. Full article
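One of the quality indicators cited above, hypervolume (HV), measures the objective-space region a front dominates relative to a reference point. A minimal two-objective sketch (minimization, hypothetical points; real DMOVRP fronts have three objectives and would typically use a library implementation):

```python
def hypervolume_2d(points, ref):
    """Hypervolume (dominated area) of a 2-objective minimization front,
    measured against a reference point `ref` worse than every solution."""
    front, best_f2 = [], float("inf")
    for f1, f2 in sorted(points):        # ascending in objective 1
        if f2 < best_f2:                 # keep only non-dominated points
            front.append((f1, f2))
            best_f2 = f2
    hv = 0.0
    for i, (f1, f2) in enumerate(front):
        next_f1 = front[i + 1][0] if i + 1 < len(front) else ref[0]
        hv += (next_f1 - f1) * (ref[1] - f2)   # slab dominated only by this point
    return hv

# Hypothetical 2-objective front (say, emissions vs. tardiness).
print(hypervolume_2d([(1, 3), (2, 2), (3, 1)], ref=(4, 4)))  # → 6.0
```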

0 pages, 383 KB  
Article
Scalable Time Series Causal Discovery with Approximate Causal Ordering
by Ziyang Jiao, Ce Guo and Wayne Luk
Mathematics 2025, 13(20), 3288; https://doi.org/10.3390/math13203288 - 14 Oct 2025
Viewed by 317
Abstract
Causal discovery in time series data presents a significant computational challenge. Standard algorithms are often prohibitively expensive for datasets with many variables or samples. This study introduces and validates a heuristic approximation of the VarLiNGAM algorithm to address this scalability problem. The standard VarLiNGAM method relies on an iterative refinement procedure for causal ordering that is computationally expensive. Our heuristic modifies this procedure by omitting the iterative refinement. This change permits a one-time precomputation of all necessary statistical values. The algorithmic modification reduces the time complexity of VarLiNGAM from O(m³n) to O(m²n + m³) while keeping the space complexity at O(m²), where m is the number of variables and n is the number of samples. While an approximation, our approach retains VarLiNGAM’s essential structure and empirical reliability. On large-scale financial data with up to 400 variables, our algorithm achieves up to a 13.36× speedup over the standard implementation and an approximate 4.5× speedup over a GPU-accelerated version. Evaluations across medical time series analysis, IT service monitoring, and finance demonstrate the heuristic’s robustness and practical scalability. This work offers a validated balance between computational efficiency and discovery quality, making large-scale causal analysis feasible on personal computers. Full article
(This article belongs to the Special Issue Advances in High-Speed Computing and Parallel Algorithm)
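Taking the stated complexities at face value, the asymptotic speedup of the heuristic over standard VarLiNGAM works out to

```latex
\frac{m^{3} n}{m^{2} n + m^{3}}
\;=\; \frac{m^{3} n}{m^{2}\,(n + m)}
\;=\; \frac{m}{1 + m/n},
```

so for n ≫ m the heuristic is roughly m times faster, while for m comparable to n the gain saturates; the measured 13.36× on 400-variable data also reflects constant factors and implementation details.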

17 pages, 2845 KB  
Article
Poisson Mean Homogeneity: Single-Observation Framework with Applications
by Xiaoping Shi, Augustine Wong and Kai Kaletsch
Symmetry 2025, 17(10), 1702; https://doi.org/10.3390/sym17101702 - 10 Oct 2025
Viewed by 186
Abstract
Practical problems often drive the development of new statistical methods by presenting real-world challenges. This paper considers testing the homogeneity of n independent Poisson means when only one observation per population is available. This scenario is common in fields where limited data from multiple sources must be analyzed to determine whether different groups share the same underlying event rate or mean. These settings often exhibit underlying structural or spatial symmetries that influence statistical behavior. Traditional methods that rely on large sample sizes are not applicable; hence, it is crucial to develop techniques tailored to the constraints of single observations. Under the null hypothesis, with large n and a fixed common mean λ, the likelihood ratio test statistic (LRTS) is shown to be asymptotically normally distributed, with the mean and variance approximated by a truncation method and a parametric bootstrap method. Moreover, with fixed n and large λ, under the null hypothesis, the LRTS is shown to be asymptotically distributed as a chi-square with n − 1 degrees of freedom. The Bartlett correction method is applied to improve the accuracy of the asymptotic distribution of the LRTS. We highlight the practical relevance of the proposed method through applications to wildfire and radioactive event data, where correlated observations and sparse sampling are common. Simulation studies further demonstrate the accuracy and robustness of the test under various scenarios, making it well-suited for modern applications in environmental science and risk assessment. Full article
(This article belongs to the Special Issue Mathematics: Feature Papers 2025)
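The chi-square limit above is easy to use in practice: under H0 with a common mean, the LRTS for one Poisson observation x_i per population reduces to 2 Σ x_i log(x_i/x̄). A minimal sketch with hypothetical counts (this illustrates only the large-λ chi-square approximation, not the paper’s truncation, bootstrap, or Bartlett-corrected procedures):

```python
import numpy as np
from scipy import stats

def poisson_lrts(x):
    """LRT statistic for H0: equal Poisson means, one observation per
    population; uses the convention 0 * log 0 = 0."""
    x = np.asarray(x, float)
    xbar = x.mean()
    safe = np.where(x > 0, x, 1.0)                 # avoid log(0) warnings
    return 2.0 * np.where(x > 0, x * np.log(safe / xbar), 0.0).sum()

# Hypothetical event counts from five sources (large common mean).
x = [48, 52, 55, 47, 50]
lrts = poisson_lrts(x)
p = stats.chi2.sf(lrts, df=len(x) - 1)            # large-lambda approximation
print(round(lrts, 3), round(p, 3))
```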

32 pages, 1049 KB  
Article
An Approximate Bayesian Approach to Optimal Input Signal Design for System Identification
by Piotr Bania and Anna Wójcik
Entropy 2025, 27(10), 1041; https://doi.org/10.3390/e27101041 - 7 Oct 2025
Viewed by 295
Abstract
The design of informatively rich input signals is essential for accurate system identification, yet classical Fisher-information-based methods are inherently local and often inadequate in the presence of significant model uncertainty and non-linearity. This paper develops a Bayesian approach that uses the mutual information (MI) between observations and parameters as the utility function. To address the computational intractability of the MI, we maximize a tractable MI lower bound. The method is then applied to the design of an input signal for the identification of quasi-linear stochastic dynamical systems. Evaluating the MI lower bound requires the inversion of large covariance matrices whose dimensions scale with the number of data points N. To overcome this problem, an algorithm that reduces the dimension of the matrices to be inverted by a factor of N is developed, making the approach feasible for long experiments. The proposed Bayesian method is compared with the average D-optimal design method, a semi-Bayesian approach, and its advantages are demonstrated. The effectiveness of the proposed method is further illustrated through four examples, including atomic sensor models, where input signals that generate a large amount of MI are especially important for reducing the estimation error. Full article
(This article belongs to the Section Signal and Data Analysis)
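A widely used tractable lower bound on mutual information of the kind the paper maximizes is the Barber–Agakov bound (shown here generically; the paper’s specific bound for quasi-linear stochastic systems may differ):

```latex
I(\theta;\,y) \;=\; H(\theta) - H(\theta \mid y)
\;\ge\; H(\theta) + \mathbb{E}_{p(\theta,\,y)}\!\left[\log q(\theta \mid y)\right],
```

valid for any variational posterior q(θ | y), with equality when q equals the true posterior p(θ | y).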

22 pages, 13067 KB  
Article
Numerical Modeling of Photovoltaic Cells with the Meshless Global Radial Basis Function Collocation Method
by Murat Ispir and Tayfun Tanbay
Energies 2025, 18(19), 5267; https://doi.org/10.3390/en18195267 - 3 Oct 2025
Viewed by 293
Abstract
Accurate prediction of photovoltaic performance hinges on resolving the electron density in the P-region and the hole density in the N-region. Motivated by this need, we present a comprehensive assessment of a meshless global radial basis function (RBF) collocation strategy for the steady current continuity equation, covering a one-dimensional two-region P–N junction and a two-dimensional single-region problem. The study employs Gaussian (GA) and generalized multiquadric (GMQ) bases, systematically varying shape parameter and node density, and presents a detailed performance analysis of the meshless method. Results map the accuracy–stability–computation-time landscape: GA achieves faster convergence but over a narrower stability window, whereas GMQ exhibits greater robustness to shape-parameter variation. We identify stability plateaus that preserve accuracy without severe ill-conditioning and quantify the runtime growth inherent to dense global collocation. A utopia-point multi-objective optimization balances error and computation time to yield practical node-count guidance; for the two-dimensional case with equal weighting, an optimum of 19 intervals per side emerges, largely insensitive to the RBF choice. Collectively, the results establish global RBF collocation as a meshless, accurate, and systematically optimizable alternative to conventional mesh-based solvers for high-fidelity carrier-density prediction in P-N junctions, thereby enabling more reliable performance analysis and design of photovoltaic devices. Full article
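To make the method concrete, here is a minimal global Gaussian-RBF (Kansa-type) collocation sketch for a 1-D model problem with a known solution; the node count and shape parameter are assumed values, and the paper’s actual target is the carrier continuity equation on one- and two-region domains:

```python
import numpy as np

# Model problem: u''(x) = -sin(x) on [0, pi], u(0) = u(pi) = 0,
# exact solution u(x) = sin(x). Node count and shape parameter are
# assumed values, not taken from the paper.
eps = 2.0                                    # Gaussian shape parameter
x = np.linspace(0.0, np.pi, 15)              # collocation nodes = RBF centres
r = x[:, None] - x[None, :]                  # signed node-centre distances

phi    = np.exp(-(eps * r) ** 2)                         # Gaussian RBF values
phi_xx = (4 * eps**4 * r**2 - 2 * eps**2) * phi          # second x-derivative

A = phi_xx.copy()
A[0, :], A[-1, :] = phi[0, :], phi[-1, :]    # Dirichlet rows at both ends
b = -np.sin(x)
b[0] = b[-1] = 0.0

w = np.linalg.solve(A, b)                    # global dense solve (Kansa method)
u = phi @ w                                  # numerical solution at the nodes
err = np.max(np.abs(u - np.sin(x)))
print(f"max error: {err:.2e}")
```

Global collocation like this yields dense, increasingly ill-conditioned systems as nodes are added, which is exactly the accuracy–stability–runtime trade-off the abstract maps out.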

20 pages, 1498 KB  
Article
Predicting the Structure of Hydrogenase in Microalgae: The Case of Nannochloropsis salina
by Simone Botticelli, Cecilia Faraloni and Giovanni La Penna
Hydrogen 2025, 6(4), 77; https://doi.org/10.3390/hydrogen6040077 - 2 Oct 2025
Viewed by 373
Abstract
The production of green hydrogen by microalgae is a promising strategy to convert the energy of sunlight into a carbon-free fuel. Many problems must be solved before large-scale industrial application. One solution is to find a microalgal species that is easy to grow, easy to manipulate, and able to produce hydrogen in open air, thus in the presence of oxygen, for periods of time as long as possible. In this work we investigate, by means of predictive computational models, the [FeFe] hydrogenase enzyme of Nannochloropsis salina, a promising microalga already used to produce high-value products in salt water. Catalysis of water reduction to hydrogen by [FeFe] hydrogenase occurs in a peculiar iron–sulfur cluster (H-cluster) contained in a conserved H-domain, well represented by the known structure of the single-domain enzyme in Chlamydomonas reinhardtii (457 residues). By combining advanced deep-learning and molecular simulation methods, we propose for N. salina a two-domain enzyme architecture hosting five iron–sulfur clusters. This organization is allowed by the protein size of 708 residues and by its sequence, rich in cysteine and histidine residues that mostly bind Fe atoms. The structure of an extended F-domain, containing four auxiliary iron–sulfur clusters and interacting with both the reductant ferredoxin and the H-domain, is thus predicted for the first time for a microalgal [FeFe] hydrogenase. This structural study is the first step towards further investigation of the microalga as a microorganism producing pure hydrogen gas. Full article

24 pages, 4669 KB  
Article
User Comfort Evaluation in a Nearly Zero-Energy Housing Complex in Poland: Indoor and Outdoor Analysis
by Małgorzata Fedorczak-Cisak, Elżbieta Radziszewska-Zielina, Mirosław Dechnik, Aleksandra Buda-Chowaniec, Beata Sadowska, Michał Ciuła and Tomasz Kapecki
Energies 2025, 18(19), 5209; https://doi.org/10.3390/en18195209 - 30 Sep 2025
Viewed by 275
Abstract
The building sector plays a key role in the transition toward climate neutrality, with national regulations across the EU requiring the construction of nearly zero-energy buildings (nZEBs). However, while energy performance has been extensively studied, less attention has been given to the problem of ensuring user comfort—both indoors and in the surrounding outdoor areas—under nZEB design constraints. This gap raises two key research objectives: (1) to evaluate whether a well-designed nZEB with extensive glazing maintains acceptable indoor thermal comfort and (2) to assess whether residents experience greater outdoor thermal comfort and satisfaction in small, sun-exposed private gardens or in larger, shaded communal green spaces. To address these objectives, a newly built residential estate near Kraków (Poland) was analyzed. The investigation included simulation-based assessments during the design phase and in situ measurements during building operation, complemented by a user survey on spatial preferences. Indoor comfort was evaluated for rooms with large glazed façades, as well as rooms with standard-sized windows, while outdoor comfort was assessed in both private gardens and a shared green courtyard. Results show that shading the southwest-oriented glazed façade with an overhanging terrace provided slightly lower temperatures in ground-floor rooms compared to rooms with standard unshaded windows. Outdoors, users experienced lower thermal comfort in small, unshaded gardens than in the larger, vegetated communal area (pocket park), which demonstrated greater capacity for temperature moderation and thermal stress reduction. Survey responses further indicate that potential future residents prefer the inclusion of a shared green–blue infrastructure area, even at the expense of building some housing units in semi-detached form, instead of maximizing the number of detached units with unshaded individual gardens. These findings emphasize the importance of addressing both indoor and outdoor comfort in residential nZEB design, showing that technological efficiency must be complemented by user-centered design strategies. This integrated approach can improve the well-being of residents while supporting climate change adaptation in the built environment. Full article

10 pages, 304 KB  
Article
Temporal Relationships Between Occupational Exposure to High Molecular Weight Allergens and Associated Short Latency Respiratory Health Outcomes: Laboratory Animal Allergens
by Howard Mason, Kate Jones and Laura Byrne
Laboratories 2025, 2(4), 19; https://doi.org/10.3390/laboratories2040019 - 29 Sep 2025
Viewed by 270
Abstract
Occupational asthma (OA) and rhinitis are health problems occurring in facilities employing animals for medical and scientific reasons. We have compared the UK trends (2006–2023) in these outcomes reported to the SWORD scheme with changes in routine and personal air monitoring for the major mouse (Mus m 1) and rat (Rat n 1) allergens. The exposure data contained 1540 mouse and 688 rat results, expressed in ng·m⁻³. The median, 75th and 90th percentiles were used as exposure characteristics, and exposure and health outcomes were compared by linear regression over annually incrementing three-year rolling data slices. The median, P75 and P90 for Mus m 1 all showed annual declines of around 5–6% (p < 0.001), suggesting general improvements in controlling mouse allergen exposure, but there was no evidence of a decline in rat allergen levels (p > 0.05), although control measures for both species are largely identical. An annual mean decline in OA of 2.9% (p = 0.021) was identified, but without a significant decline in rhinitis (−1.4%; p = 0.21). Over 16 years, reductions in exposure to the predominant rodent species were accompanied by a concomitant but smaller reduction in OA. These data confirm the immediate value of controlling relevant allergen exposure in reducing the incidence of IgE-mediated OA. Full article
(This article belongs to the Special Issue Laboratory Preparedness for Emerging Infectious Diseases)
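An annual percent decline of the sort reported above is commonly estimated from a log-linear fit, where the regression slope on log-levels converts to a multiplicative yearly change. A sketch on synthetic data (not the SWORD or monitoring data; the decline rate, noise level, and seed are invented):

```python
import numpy as np

# Annual percent change from a log-linear fit: log(level) = a + b*year,
# so each year multiplies the level by exp(b). Synthetic data only.
rng = np.random.default_rng(0)
years = np.arange(2006, 2024)
levels = 120.0 * 0.945 ** (years - 2006) * rng.lognormal(0.0, 0.05, years.size)

b = np.polyfit(years, np.log(levels), 1)[0]   # slope on the log scale
annual_change = 1.0 - np.exp(b)               # fraction lost per year
print(f"estimated annual decline: {annual_change:.1%}")
```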

28 pages, 2180 KB  
Article
Entropy-Based Uncertainty Quantification in Linear Consecutive k-out-of-n:G Systems via Cumulative Residual Tsallis Entropy
by Boshra Alarfaj, Mohamed Kayid and Mashael A. Alshehri
Entropy 2025, 27(10), 1020; https://doi.org/10.3390/e27101020 - 28 Sep 2025
Viewed by 273
Abstract
Quantifying uncertainty in complex systems is a central problem in reliability analysis and engineering applications. In this work, we develop an information-theoretic framework for analyzing linear consecutive k-out-of-n:G systems using the cumulative residual Tsallis entropy (CRTE). A general analytical expression for CRTE is derived, and its behavior is investigated under various stochastic ordering relations, providing insight into the reliability of systems governed by continuous lifetime distributions. To address challenges in large-scale settings or with nonstandard lifetimes, we establish analytical bounds that serve as practical tools for uncertainty quantification and reliability assessment. Beyond theoretical contributions, we propose a nonparametric CRTE-based test for dispersive ordering, establish its asymptotic distribution, and confirm its statistical properties through extensive Monte Carlo simulations. The methodology is further illustrated with real lifetime data, highlighting the interpretability and effectiveness of CRTE as a probabilistic entropy measure for reliability modeling. The results demonstrate that CRTE provides a versatile and computationally feasible approach for bounding analysis, characterization, and inference in systems where uncertainty plays a critical role, aligning with current advances in entropy-based uncertainty quantification. Full article
(This article belongs to the Special Issue Uncertainty Quantification and Entropy Analysis)
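For orientation, the cumulative residual Tsallis entropy of a nonnegative lifetime X with survival function F̄ is commonly defined as follows (a common form in the literature; the paper’s normalization may differ):

```latex
\xi_{\alpha}(X) \;=\; \frac{1}{\alpha - 1}
\int_{0}^{\infty} \left( \bar{F}(x) - \bar{F}^{\,\alpha}(x) \right) dx,
\qquad \alpha > 0,\ \alpha \neq 1,
```

which recovers the cumulative residual entropy, −∫₀^∞ F̄(x) log F̄(x) dx, in the limit α → 1.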

21 pages, 5935 KB  
Article
A Superhydrophobic Gel Fracturing Fluid with Enhanced Structural Stability and Low Reservoir Damage
by Qi Feng, Quande Wang, Naixing Wang, Guancheng Jiang, Jinsheng Sun, Jun Yang, Tengfei Dong and Leding Wang
Gels 2025, 11(10), 772; https://doi.org/10.3390/gels11100772 - 25 Sep 2025
Viewed by 330
Abstract
Conventional fracturing fluids, while essential for large-volume stimulation of unconventional reservoirs, often induce significant reservoir damage through water retention and capillary trapping. To address this problem, this study developed a novel superhydrophobic nano-viscous drag reducer (SN-DR), synthesized through a multi-monomer copolymerization and silane modification strategy, which enhances structural stability and minimizes reservoir damage. The structure and thermal stability of SN-DR were characterized by FT-IR, ¹H NMR, and TGA. Rheological evaluations demonstrated that the gel fracturing fluid exhibits a highly stable three-dimensional network structure, with a G′ maintained at approximately 3000 Pa and excellent shear recovery under cyclic stress. Performance tests showed that a 0.15% SN-DR solution achieved a drag reduction rate of 78.1% at 40 L/min, reduced oil–water interfacial tension to 0.91 mN·m⁻¹, and yielded a water contact angle of 152.07°, confirming strong hydrophobicity. Core flooding tests revealed a flowback rate exceeding 50% and an average permeability recovery of 86%. SEM and EDS indicated that the gel formed nanoscale, tightly packed papillary structures on core surfaces, enhancing roughness and reducing water intrusion. The study demonstrates that the gel fracturing fluid enhances structural stability, alters wettability, and mitigates water-blocking damage. These findings offer a new strategy for designing high-performance fracturing fluids with integrated drag reduction and reservoir protection properties, providing significant theoretical insights for improving hydraulic fracturing efficiency. Full article
(This article belongs to the Section Gel Applications)

16 pages, 417 KB  
Article
Central Sensitization Syndromes and Trauma: Mediating Role of Sleep Quality, Pain Catastrophizing, and Emotional Dysregulation Between Post-Traumatic Stress Disorder and Pain
by Elena Miró, Ana Isabel Sánchez, Ada Raya and María Pilar Martínez
Healthcare 2025, 13(17), 2221; https://doi.org/10.3390/healthcare13172221 - 4 Sep 2025
Viewed by 1143
Abstract
Background: Central sensitization syndromes (CSSs) are associated with a high incidence of traumatic events; however, few studies have examined the potential mechanisms linking post-traumatic stress disorder (PTSD) and pain. Objectives: The present research aims to clarify this association by exploring the presence of trauma, PTSD, and related clinical variables in participants with CSSs compared to healthy controls and those with medical problems. Methods: A large sample of both sexes from the Spanish general population (n = 1542; aged 18–84 years) completed an online survey assessing the presence of traumatic experiences (psychological trauma, physical trauma, physical and sexual abuse), PTSD, and other clinical measures (central sensitization, pain, sleep quality, anxiety, depression, perceived stress, and emotional regulation). Results: The CSS group (n = 467) showed a higher incidence of repeated trauma, PTSD, and dissociative symptoms compared to the medical pathologies (n = 214) and healthy (n = 861) groups. The CSS group also showed greater clinical impairment than the other groups, especially the CSS subgroup with PTSD. In this subgroup, PTSD symptoms were significantly correlated with the remaining clinical measures, and sleep dysfunction, pain catastrophizing, and emotional dysregulation mediated the relationship between PTSD and pain, accounting for 55.3% of the variance. Conclusions: CSSs represent a major therapeutic challenge. An integrated treatment addressing both trauma and pain, with an emphasis on sleep quality, pain catastrophizing, and emotional regulation, could improve the effectiveness of current therapeutic approaches. Full article
(This article belongs to the Section Pain Management)

16 pages, 3175 KB  
Article
Research and Optimization of Key Technologies for Manure Cleaning Equipment Based on a Profiling Wheel Mechanism
by Fengxin Yan, Can Gao, Lishuang Ren, Jiahao Li and Yuanda Gao
AgriEngineering 2025, 7(9), 287; https://doi.org/10.3390/agriengineering7090287 - 3 Sep 2025
Viewed by 685
Abstract
This study addresses the problems of poor dynamic stability, high vibration coupling, and inefficient energy use in large-farm manure handling machines. A profiling wheel-based, multi-disciplinary approach is proposed. With a rocker-arm prototype, double-ball heads, and a hydraulic damping system, a parametric design is built that accounts for vibration and energy consumption. Simulation results in EDEM2022 and ANSYS2022 confirm the structural viability and motion compensation capability, while NSGA-II optimizes the damping parameters (k1 = 380 kN/m, C = 1200 N·s/m). The results show a 14.7% reduction in σFc, a 14.3% decrease in αRMS, resonance avoidance (14–18 Hz), Δx (horizontal offset of the frame) < 5 mm, a reduction in power loss from 18% to 12.5%, and a 62% improvement in stability. The novelty of this research lies in constructing a dynamic model that combines Hertz contact theory with the modal decoupling method, coupled with an adaptive damping algorithm and a mechanical–hydraulic–control optimization platform. Future work could integrate lightweight materials and multi-machine collaboration for smarter, greener manure cleaning. Full article
(This article belongs to the Section Agricultural Mechanization and Machinery)

25 pages, 5006 KB  
Article
Incorporating Finite Particle Number and Heat-Temperature Differences in the Maxwell–Boltzmann Speed Distribution
by Everett M. Criss and Anne M. Hofmeister
Foundations 2025, 5(3), 29; https://doi.org/10.3390/foundations5030029 - 25 Aug 2025
Viewed by 676
Abstract
The often-used analytical representation of the Maxwell–Boltzmann classical speed distribution function (F) for elastic, indivisible particles assumes an infinite limit for the speed. Consequently, volume and the number of particles (n) extend to infinity; both infinities contradict assumptions underlying this non-relativistic formulation. Finite average kinetic energy and temperature (T) result from normalization of F, which removes n; however, total energy (i.e., the heat of the collection) remains infinite because n is infinite. This problem persists in recent adaptations. To better address real (finite) systems, wherein T depends on heat, we generalize this one-parameter distribution (F, cast in energy) by proposing a two-parameter gamma distribution function (F*) in energy, which reduces to F at large n. Its expectation value of kT (k = Boltzmann’s constant) replicates F, whereas the shape factor depends on n and affects the averages, as expected for finite systems. We validate F* via a first-principles molecular dynamics numerical model of energy- and momentum-conserving collisions for 26, 182, and 728 particles in three-dimensional physical space. Dimensionless calculations provide generally applicable results; a total of 10⁷ collisions suffices to represent an equilibrated collection. Our numerical results show that individual momentum-conserving collisions in three dimensions provide symmetrical speed distributions in all Cartesian directions. Thus, momentum- and energy-conserving collisions are the physical cause of the equipartitioning of energy; the validity of this theorem for other systems depends on their specific motions. Our numerical results set upper limits on the kinetic energy of individual particles; restrict the n particles to some finite volume; and lead to a formula in terms of n for conserving total energy when utilizing F* for convenience. Implications of our findings for matter under extreme conditions are briefly discussed. Full article
(This article belongs to the Section Physical Sciences)
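The connection between F and a gamma law is easy to check numerically: in the classical Maxwell–Boltzmann picture, per-particle kinetic energy is gamma-distributed with shape 3/2 and scale kT (a scaled chi-square with three degrees of freedom), so its mean is (3/2)kT. A sampling sketch in reduced units (illustrating the textbook limit F, not the paper’s finite-n F*; sample size and seed are arbitrary):

```python
import numpy as np

# Classical Maxwell-Boltzmann kinetic energy is gamma-distributed with
# shape 3/2 and scale kT, hence <E> = (3/2) kT and Var(E) = (3/2) (kT)^2.
rng = np.random.default_rng(42)
kT = 1.0
E = rng.gamma(shape=1.5, scale=kT, size=200_000)

print(E.mean())   # close to 1.5
print(E.var())    # close to 1.5
```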
