Search Results (28,394)

Search Parameters:
Keywords = production parameters

16 pages, 1406 KB  
Article
Biomechanical Voice Parameters as Potential Biomarkers for Phenotype Differentiation in Amyotrophic Lateral Sclerosis: A Cross-Sectional Study
by Margarita Pérez-Bonilla, Marina Mora-Ortiz, Paola Díaz-Borrego, María Nieves Muñoz-Alcaraz, Fernando J. Mayordomo-Riera and Eloy Girela-López
Med. Sci. 2026, 14(1), 112; https://doi.org/10.3390/medsci14010112 (registering DOI) - 26 Feb 2026
Abstract
Background/Objectives: Amyotrophic lateral sclerosis (ALS) is a clinically heterogeneous neurodegenerative disease in which bulbar involvement frequently affects speech and voice production. Although acoustic voice analysis can detect phonatory alterations in ALS, its ability to differentiate clinical phenotypes remains limited. This study investigated whether biomechanical voice parameters provide complementary information for characterizing bulbar involvement across bulbar-onset ALS (ALS-B) and spinal-onset ALS (ALS-S) and explored their association with clinical and functional measures. Methods: This cross-sectional observational study included 50 patients with ALS (20 ALS-B, 30 ALS-S) and 50 controls with non-neurological voice disorders. Sustained vowel phonation was analyzed using acoustic measures and biomechanical voice parameters derived from a standardized model of vocal fold vibration. Perceptual voice severity was assessed using the GRBAS scale, while functional status was evaluated with the ALS Functional Rating Scale–Revised (ALSFRS-R) and the Barthel Index. Associations with clinical measures were explored in secondary analyses. Results: Compared with controls, ALS patients showed significant differences in acoustic measures and several biomechanical parameters related to glottal closure and vibratory stability. Biomechanical analysis revealed significant differences between ALS-B and ALS-S, particularly in parameters reflecting vibratory asymmetry, glottal tension and cycle-to-cycle instability. Unexpectedly, ALS-B showed greater perceptual voice severity and higher Barthel Index scores than ALS-S, while no differences were observed in global ALSFRS-R total scores. Conclusions: Biomechanical voice analysis appears to capture physiologically meaningful alterations in vocal fold function in ALS and provides complementary information for characterizing bulbar motor involvement across clinical phenotypes, particularly ALS-B disease. 
When combined with acoustic and clinical assessments, this approach may enhance the evaluation of bulbar involvement and functional status in ALS.

20 pages, 820 KB  
Article
A Risk-Based Universal Calibration Interval Model Using Monte Carlo Simulation
by Dmytro Malakhov, Tatiana Kelemenová and Michal Kelemen
Appl. Sci. 2026, 16(5), 2230; https://doi.org/10.3390/app16052230 (registering DOI) - 26 Feb 2026
Abstract
Sustainable manufacturing requires modern intelligent approaches to monitoring products of the manufacturing process. An integral part of intelligent manufacturing is the measurement of geometric parameters of products, which allows diagnosing the state of the manufacturing process, optimizing it and predicting its further development. For these reasons, it is necessary to monitor the condition of measuring instruments, as decision-making is based on the data provided by them. Calibration intervals of measuring instruments are commonly defined using fixed time-based rules that are not explicitly linked to measurement uncertainty growth or conformity risk. This practice may lead to either unnecessary recalibration or an increased probability of using out-of-tolerance instruments. In this study, a Monte Carlo-based methodology for determining recalibration intervals is proposed, in which recalibration decisions are derived from the probabilistic evolution of measurement error over time. Measurement uncertainty is modeled as a time-dependent stochastic process combining calibration uncertainty, drift behavior, and repeatability. Monte Carlo simulation is used to propagate uncertainty and to estimate both the expanded uncertainty and the probability that the measurement error exceeds the maximum permissible error (MPE). The recalibration interval is defined as the earliest time at which this probability exceeds a predefined acceptable risk threshold. A numerical experiment using realistic synthetic data representative of a typical dimensional measuring instrument demonstrates that probability-based and uncertainty-based criteria may lead to substantially different recalibration intervals. The results confirm that risk-informed recalibration intervals provide a more transparent and metrologically justified alternative to fixed schedules while remaining fully compatible with ISO/IEC 17025 and GUM principles. 
The proposed approach is instrument-agnostic and readily applicable in calibration laboratories and industrial measurement systems.
(This article belongs to the Special Issue Advanced Digital Design and Intelligent Manufacturing, 2nd Edition)
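The risk-based criterion this abstract describes (propagate a time-dependent measurement-error model by Monte Carlo, then take the earliest time at which the probability of exceeding the MPE passes an acceptable risk threshold) can be sketched in a few lines. The error model and every numeric parameter below are illustrative assumptions, not values from the paper:

```python
import random

def recalibration_interval(mpe, risk, months=60, n_sims=20_000,
                           u_cal=0.002, drift_rate=0.0004, u_rep=0.001,
                           seed=42):
    """Earliest month at which P(|error| > MPE) exceeds the risk threshold.

    Hypothetical error model (all parameters are illustrative):
      error(t) = e_cal + drift_rate * t + e_rep,
    where e_cal ~ N(0, u_cal) is fixed per simulated instrument and
    e_rep ~ N(0, u_rep) is redrawn per measurement.
    """
    rng = random.Random(seed)
    e_cal = [rng.gauss(0.0, u_cal) for _ in range(n_sims)]
    for t in range(1, months + 1):
        # Fraction of simulated instruments out of tolerance at month t.
        exceed = sum(
            abs(e + drift_rate * t + rng.gauss(0.0, u_rep)) > mpe
            for e in e_cal
        )
        if exceed / n_sims > risk:
            return t
    return None  # risk threshold never reached within the horizon

print(recalibration_interval(mpe=0.01, risk=0.05))
```

With these illustrative numbers the drift term dominates and the simulated interval comes out at around 16 months; the same loop applies to any instrument once the calibration uncertainty, drift rate, and repeatability are estimated from calibration history.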

14 pages, 525 KB  
Article
Associations of Blood Lactate Dehydrogenase Activity with Blood Biochemical and Automated Milk Monitoring Parameters in Early-Lactation Dairy Cows
by Akvilė Girdauskaitė, Samanta Grigė, Inga Sabeckienė, Karina Džermeikaitė, Justina Krištolaitytė, Zoja Miknienė, Mindaugas Televičius, Lina Anskienė, Dovilė Malašauskienė and Ramūnas Antanaitis
Agriculture 2026, 16(5), 502; https://doi.org/10.3390/agriculture16050502 (registering DOI) - 25 Feb 2026
Abstract
Lactate dehydrogenase (LDH) is widely used as a nonspecific marker of tissue damage and cellular turnover and has been associated with metabolic and inflammatory processes, but its relationship with automated monitoring data and blood biochemical indicators in early-lactation dairy cows is still not well described. The aim of this study was to evaluate associations between LDH activity, blood biochemical parameters, and automated monitoring indicators in early-lactation Holstein cows. A total of 91 clinically healthy cows were classified into two groups according to LDH activity: Group 1 (LDH < 1364 U/L; n = 53) and Group 2 (LDH ≥ 1364 U/L; n = 38). Blood samples were collected once per cow during early lactation, whereas automated monitoring parameters were continuously recorded and daily averages corresponding to the sampling day were used for analysis. Cows with higher LDH activity had significantly higher aspartate aminotransferase (AST) activity and moderate increases in albumin (ALB), creatinine (CREA), gamma-glutamyl transferase (GGT), calcium (Ca), phosphorus (PHOS), and iron (Fe). Correlation analysis showed a strong positive association between LDH and AST (r = 0.799, p < 0.001), while moderate positive correlations were observed with ALB, alanine aminotransferase (ALT), CREA, Ca, GGT, Fe, and PHOS. Receiver operating characteristic (ROC) analysis showed the best discrimination ability for AST, while CREA, ALB, Fe, PHOS, Ca, and GGT showed moderate classification performance. Automated monitoring parameters did not differ significantly between groups; however, cows with higher LDH activity tended to show lower rumination time together with higher milk electrical conductivity, higher milk yield, higher fat-to-protein ratio (FPR), and higher somatic cell count (SCC). 
Overall, the results indicate that LDH is more closely related to systemic biochemical variation than to immediate changes in production or behavioral indicators, and support the use of biochemical markers together with automated monitoring data when evaluating physiological adaptation during early lactation.
(This article belongs to the Section Farm Animal Production)
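
Two of the analysis steps reported in this abstract, thresholding LDH at 1364 U/L and computing Pearson correlations between LDH and blood markers such as AST, are simple to reproduce. The values below are synthetic illustrations, not the study's data:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Illustrative (synthetic) LDH and AST values for a handful of cows:
ldh = [1100, 1250, 1380, 1500, 1700, 1900]
ast = [78, 85, 92, 101, 115, 128]

# Group assignment uses the study's 1364 U/L cut-off.
groups = ["Group 2" if v >= 1364 else "Group 1" for v in ldh]
print(groups)
print(round(pearson(ldh, ast), 3))
```

On real data one would also test the coefficient for significance; the synthetic values here are constructed only to show the mechanics of the cut-off and the correlation.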
29 pages, 4431 KB  
Article
Integrating CO2-EOR and Sequestration via Assisting Steam Huff and Puff in Offshore Heavy Oil Reservoirs with Bottom Water
by Guodong Cui, Kaijun Yuan, Haiqing Cheng, Quanqi Dai, Xi Chen, Rui Wang, Zhe Hu and Zheng Niu
J. Mar. Sci. Eng. 2026, 14(5), 423; https://doi.org/10.3390/jmse14050423 (registering DOI) - 25 Feb 2026
Abstract
CO2-assisted steam huff and puff is an effective method to improve oil recovery and store CO2 in heavy oil reservoirs. However, few studies have focused on complex geological settings, such as reservoirs with bottom water. Bottom water not only complicates oil production and CO2 sequestration but also obscures the migration and distribution of oil, water and CO2. In this paper, a numerical geological model of an offshore heavy oil reservoir with bottom water is established to analyze the influence of bottom water on injection and production parameters, oil recovery and CO2 storage capability under vertical and horizontal well layouts. The results show that the bottom water can maintain formation pressure but reduces the steam chamber radius and the heavy oil utilization area, increases water production and decreases the oil–water ratio. CO2 can still enhance oil recovery in bottom-water reservoirs. The oil development indicators of the horizontal well exceed those of the vertical well. Meanwhile, CO2-assisted steam huff and puff in a bottom-water reservoir creates a high-pressure, high-temperature environment that keeps CO2 supercritical, giving better CO2 storage capability and efficiency. The CO2 storage efficiency of the horizontal well is 63% higher than that of the vertical well. Thus, the horizontal well layout should be prioritized when bottom water is present. A sensitivity analysis of formation parameters shows that the advantageous conditions are high oil saturation, porosity of 0.2–0.4 and permeability of 2000–3000 mD. The degree of influence of each formation parameter was evaluated as well.
(This article belongs to the Section Marine Energy)

30 pages, 2996 KB  
Article
The State and Development Directions of Polish Waste-to-Energy Plants in Improving R1-Based Energy Recovery Performance
by Marian Banaś, Tadeusz Pająk, Wojciech Wróbel and Józef Ciuła
Energies 2026, 19(5), 1143; https://doi.org/10.3390/en19051143 (registering DOI) - 25 Feb 2026
Abstract
The paper presents an analysis of the status and development trends of Polish Waste-to-Energy (WtE) installations in the context of improving the level of energy recovery measured by the R1 indicator of the Waste Framework Directive (R1 is a regulatory indicator of the R1/D10 classification, not the thermodynamic efficiency of the installation). Based on the standardised annual operating energy balances of six mature municipal waste incineration plants from 2020 to 2024 and partial data for 2025, electricity and heat production, auxiliary media consumption and waste fuel parameters were compared, and R1 was calculated in the Ep, Ef, Ew and Ei systems. The R1 values were then compared with heat collection conditions and modernisation measures (integration with the heating network, exhaust gas condensation, advanced control/predictive algorithms), treating the ‘before/after’ comparisons as an observational assessment, without inferring strict causality. The average R1 for the facilities studied in 2020–2024 was 0.864, with the highest values recorded for installations in Kraków (R1 = 1.123 in 2024). The results indicate that a high and growing R1 is primarily associated with cogeneration and stable heat management in district heating systems, and that upgrades aimed at additional heat recovery and process stabilisation can further support this trend, in line with the ‘energy efficiency first’ principle. A novelty of the study is the standardised, long-term benchmarking of full-scale data for six installations using a uniform R1 methodology.
(This article belongs to the Collection Energy Efficiency and Environmental Issues)
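For reference, the R1 indicator that the abstract computes from the Ep, Ef, Ew and Ei energy balances is defined in Annex II of the Waste Framework Directive (2008/98/EC). A minimal calculator is sketched below; the plant figures in the example are purely illustrative and are not taken from the six Polish installations:

```python
def r1(ep_elec, ep_heat, ef, ew, ei):
    """R1 energy-efficiency indicator, Waste Framework Directive Annex II:
        R1 = (Ep - (Ef + Ei)) / (0.97 * (Ew + Ef))

    ep_elec: annual electricity produced (GJ), weighted by 2.6
    ep_heat: annual heat produced for commercial use (GJ), weighted by 1.1
    ef:      annual energy input from fuels contributing to steam production (GJ)
    ew:      annual energy contained in the treated waste (GJ, by net calorific value)
    ei:      annual imported energy, excluding Ew and Ef (GJ)
    The 0.97 factor accounts for energy losses via bottom ash and radiation.
    """
    ep = 2.6 * ep_elec + 1.1 * ep_heat
    return (ep - (ef + ei)) / (0.97 * (ew + ef))

# Illustrative figures only (not from the paper):
print(round(r1(ep_elec=150_000, ep_heat=480_000, ef=20_000,
               ew=900_000, ei=10_000), 3))
```

Under the Directive, an R1 of at least 0.65 (0.60 for plants permitted before 2009) classifies an installation as energy recovery (R1) rather than disposal (D10), which is the regulatory distinction the abstract refers to.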

40 pages, 2259 KB  
Article
Multi-Group Fully Homomorphic Encryption Scheme Based on LWE and NTRU
by Yongheng Li, Jing Wen, Shaoling Liang, Fanqi Kong and Baohua Huang
Electronics 2026, 15(5), 940; https://doi.org/10.3390/electronics15050940 (registering DOI) - 25 Feb 2026
Abstract
Multi-group homomorphic encryption (MGHE) is a pivotal advance in secure multi-party computation, integrating merits of multi-party homomorphic encryption (MPHE) and multi-key homomorphic encryption (MKHE) to eliminate MPHE’s fixed-party limitation and mitigate MKHE’s ciphertext expansion from dynamic enrollment. However, the efficient single-key FINAL scheme cannot extend to multi-party scenarios due to the challenge of defining valid multiplication for vector NTRU ciphertexts, which hinders its use in multi-group bootstrapping and curbs efficiency. To address this, additive secret sharing was adopted to convert vector NTRU ciphertext multiplication into secret share multiplication, enabling shared bootstrapping key generation within groups. A new multi-group ciphertext bootstrapping algorithm for MGHE was developed via the integration of LWE and NTRU cryptographic primitives. Bootstrapping tasks were decomposed for parallel processing, and a hybrid product algorithm was designed to aggregate subtask outputs, boosting multi-group bootstrapping speed to match that of single-key ciphertexts. Noise accumulation was analyzed, with 100-bit and 128-bit security parameter sets selected for validation. Experiments showed that 30- and 50-party multi-group bootstrapping takes only 1.87 s and 2.58 s, respectively.
16 pages, 8594 KB  
Article
Microstructure and Mechanical Properties of Aluminum Alloy Studs Using Wire–Laser Directed Energy Deposition
by Fawu Xiang, Jiangang Wang, Likun Yang, Hui Gao, Yingying Huang and Haihe Jiang
J. Manuf. Mater. Process. 2026, 10(3), 78; https://doi.org/10.3390/jmmp10030078 (registering DOI) - 25 Feb 2026
Abstract
In this study, an annular laser beam shaping optics and a wire feeding system are used for additive manufacturing. A discrete concentric layering trajectory strategy (DCL-TS) and a continuous deposition trajectory strategy (CD-TS) for the wire–laser directed energy deposition (WL-DED) of aluminum alloy stud structures are developed. Initially, combinations of parameters, such as laser power, transverse speed, and wire feeding speed, that produce a single-layer structure with good morphology and no visible pores or cracks are identified. Then, the DCL-TS and CD-TS manufacturing strategies are used to produce aluminum alloy studs of similar dimensions. The EBSD results indicate that the CD-TS produces finer grains in the aluminum alloy studs than the DCL-TS; correspondingly, mechanical testing reveals superior microhardness and tensile strength in the CD-TS studs. Subsequent tensile testing verifies that aluminum alloy studs deposited by WL-DED on the substrate can meet the requirements for practical application in mobile phones, computers, etc. This research enhances the mechanical properties of additively manufactured parts and significantly improves manufacturing efficiency, providing a promising solution for rapid production.

33 pages, 4439 KB  
Article
A 3M Framework for Gross Ecosystem Product Valuation in Natural Protected Areas: Integrating Parameter Localization with Uncertainty Analysis
by Qing Zhang, Jiangzhou Wu, Tianyu Cen and Yongde Zhong
Sustainability 2026, 18(5), 2216; https://doi.org/10.3390/su18052216 (registering DOI) - 25 Feb 2026
Abstract
Natural protected areas harbor ecosystems with significant ecological functions and economic value. The scientific accounting of Gross Ecosystem Product (GEP) is therefore critical for harmonizing ecological conservation with regional development. Using China’s Xilingol Grassland National Nature Reserve as a case study, this paper develops and applies a novel “3M” GEP accounting framework, integrating the three core elements of multi-dimensional indicators, multi-source data, and multi-method adaptation. This framework was employed to systematically quantify the values of the reserve’s provisioning, regulating, and cultural ecosystem services. The results show an annual GEP of CNY 170.5229 billion for the 5835.65 km² reserve. Regulating services constituted the dominant share (97.77%), with climate regulation being the most significant component (CNY 160.15 billion). It is important to note that this high proportion is method-dependent, stemming from the industrial-substitution scenarios used to value non-market services. The combined contribution of provisioning and cultural services was 2.23%, representing 1.00% and 1.23%, respectively. Uncertainty analysis indicated a total error margin of ±9.3% (95% confidence interval), which is within an acceptable range for ecological accounting. The primary sources of uncertainty were data-resolution limitations, methodological choices, and regional parameter variability. These findings, corroborated by sensitivity analysis, confirm the robustness of the GEP estimate and clarify the influence of key ecological parameters on the valuation. By optimizing regional indicator adaptation, methodological localization, and multi-source data cross-validation, the proposed framework enhances the accuracy and policy relevance of ecosystem service valuation. It thus provides a methodological reference for GEP accounting and ecological asset management in other natural protected areas.
(This article belongs to the Section Sustainable Products and Services)

22 pages, 1996 KB  
Article
Lightweight Self-Supervised Hybrid Learning for Generalizable and Real-Time Fault Diagnosis in Photovoltaic Systems
by Ghalia Nassreddine, Obada Al-Khatib, Imran, Mohamad Nassereddine and Ali Hellany
Algorithms 2026, 19(3), 173; https://doi.org/10.3390/a19030173 - 25 Feb 2026
Abstract
Photovoltaic (PV) systems nowadays represent an essential component of renewable energy production. However, undetected faults often compromise their reliability, leading to significant energy losses and high maintenance costs. Existing deep learning approaches for PV fault diagnosis have achieved high accuracy, but they require massive, labeled datasets and high computational resources, which make them unsuitable for real-time applications. This paper proposes a lightweight, self-supervised hybrid learning framework for real-time PV fault diagnosis to address these limitations. First, the dataset is split into training, testing, and validation subsets. Thereafter, weighted class calculation steps are performed to overcome the issue of imbalance in the data. Then, a self-supervised pre-training phase is established to enable the encoder to produce effective internal representations prior to the implementation of a supervised fine-tuning classifier, characterized as a lightweight feed-forward network (Dense–Dropout–Dense Softmax), which will be trained using categorical cross-entropy and fault-type labels. Finally, a supervised fine-tuning stage is employed based on the pre-trained hybrid CNN–transformer encoder to perform PV fault classification. The experimental results indicate that the proposed approach outperforms existing models by achieving an overall accuracy of 99.8%, a recall of 99.6%, and an outstanding specificity of 100%. The confusion matrix demonstrates that classification is excellent on all operating types. Runtime analysis indicates that the model processes each sample in 2.78 ms and requires 0.07 MB to store weights of 19,429 parameters, confirming its suitability for real-time deployment. 
These findings highlight that using a hybrid CNN–Transformer encoder with self-supervised learning can improve fault detection and classification performance while significantly reducing inference time, making it an effective and efficient solution for intelligent PV system monitoring.
(This article belongs to the Special Issue AI-Driven Control and Optimization in Power Electronics)
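The abstract's "weighted class calculation" step for countering data imbalance is not specified in detail; a common heuristic with the shape such a step usually takes is inverse-frequency weighting, in which rare fault classes receive proportionally larger loss weights. The labels and counts below are hypothetical:

```python
from collections import Counter

def class_weights(labels):
    """Inverse-frequency class weights: weight = n / (k * count),
    where n is the sample count and k the number of classes.
    This is a common balancing heuristic; the paper's exact
    weighting scheme is not specified in the abstract.
    """
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {cls: n / (k * c) for cls, c in counts.items()}

# Hypothetical PV operating-condition labels with class imbalance:
labels = ["normal"] * 80 + ["open_circuit"] * 12 + ["partial_shading"] * 8
print(class_weights(labels))
```

The resulting dictionary can be passed to a training loop so that misclassifying a rare fault (here, partial shading) costs roughly five times as much as misclassifying the majority class.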

18 pages, 1285 KB  
Article
Research on the Cutting Path Control of Coal Mining Machine Based on Dynamic Geological Models
by Lin An and Yang Dai
Appl. Sci. 2026, 16(5), 2210; https://doi.org/10.3390/app16052210 - 25 Feb 2026
Abstract
Planned cutting is a core technique for intelligent coal mining, relying on high-precision geological models of fully mechanized mining faces to plan the cutting trajectory of mining equipment, with model accuracy as a prerequisite for intelligent mining. To address the limitations of traditional interpolation methods in dynamic model updating and the technical gap between geological information and equipment control parameters, this study proposes a coal mining machine cutting path control method based on dynamic geological models. An improved smooth discrete interpolation method is developed to realize dynamic updating of the geological model, effectively improving the accuracy of local geological models and ensuring safe mining operations. Meanwhile, a method for converting geological information into coal mining equipment control parameters is proposed, breaking the technical barrier between geological data and production control information and laying a foundation for unmanned and intelligent mining. Field tests conducted in a shaft coal mine in Shaanxi demonstrate that the method achieves precise control of the coal mining machine’s trajectory: during a 7-day trial, the working face advanced 56 m and mined 51,000 tons of coal with minimal human intervention. Comparative analysis shows that the error between the planned cutting based on the dynamic geological model and manual cutting is within 10 cm, and the drum height curve is smoother, reducing frequent adjustments and facilitating equipment protection. Dynamic model updating ensures high accuracy, with an average absolute error of 0.029 m at 5 m from the update point and 0.101 m at 10 m, meeting the requirements for automated cutting. The successful application of this method verifies its feasibility in actual mining processes, providing a new technical approach for achieving unmanned and intelligent coal mining.
13 pages, 996 KB  
Article
Comparative Methodology of Viscosity-Based Classification and Measurement Techniques for High-Temperature Behaviour of Paving Grade Bitumen
by Szabolcs Rosta, Zita Szabó and László Gáspár
Appl. Sci. 2026, 16(5), 2208; https://doi.org/10.3390/app16052208 (registering DOI) - 25 Feb 2026
Abstract
The accurate determination of the rheological properties of road bitumen types is essential for the reliable prediction of long-term pavement behaviour. Dynamic viscosity at 60 °C is a key rheological parameter characterising the shear-dependent viscoelastic behaviour of bitumen in the temperature range relevant to in-service pavement loading. This study aims to compare different viscosity determination methods—approximations, capillary viscosity, Brookfield measurement and complex viscosity determined by a dynamic shear rheometer (DSR)—and to analyse their relationships with each other in order to find the best method for bitumen classification. Furthermore, the European and Australian bitumen classification standards are compared in terms of dynamic viscosity and penetration; the Australian bitumen types show more stable results, with a CV% of less than 10 percent. The study is based on the testing of Hungarian paving-grade bitumens (B50/70, B70/100) and Australian viscosity-graded bitumens (C170, C320), with the comparison of a total of 191 samples obtained from industrial production. The statistical evaluation of the results obtained with the different methods was based on Pearson correlation analysis and relative deviation analysis. The results indicate that the DSR measurement at 1.6 Hz shows the closest agreement with capillary viscosity, with a linear correlation coefficient of 0.95, and exhibits the strongest overall correlation with the other measurement approaches, whereas the Heukelom equation tends to overestimate the dynamic viscosity. The Brookfield method yielded higher viscosity values in all tests. The study highlights that the results of different measurement methods can only be compared under specific shear conditions, and a DSR-based approach can be more suitable for the introduction of a new European bitumen classification system.

36 pages, 997 KB  
Article
Genetic Algorithms for Pareto Optimization in Bayesian Cournot Games Under Incomplete Cost Information
by David Carfí, Alessia Donato and Emanuele Perrone
Mathematics 2026, 14(5), 762; https://doi.org/10.3390/math14050762 - 25 Feb 2026
Abstract
This paper develops a practical computational framework for the Bayesian Cournot model with bilateral incomplete cost information, where each player is uncertain about the opponent’s marginal cost, drawn from a continuous compact interval [c_*, c^*] with 0 < c_* < c^* < ∞. The infinite dimensionality of the functional strategy spaces (mappings from types to production quantities) renders analytical closed-form solutions infeasible in this continuous-type setting. To overcome this challenge, we restrict the strategy spaces to finite-dimensional differentiable sub-manifolds—specifically, one-parameter families of oscillatory functions (cosine, sine, and mixed forms). After suitable affine Q-rescaling to map the oscillatory range into the production interval [0, Q], and with parameter ranges satisfying α, β > (π/2)/c_*, these curves ensure near-exhaustivity: the joint production map (α, β) ↦ (x_α(s), y_β(t)) covers [0, Q]² densely for every fixed cost pair (s, t), thereby recovering (up to density and closure) the full ex-post payoff space. We introduce the ex-post payoff mapping Φ(s, t, x, y) = (e_s(x, y)(t), f_t(x, y)(s)), which collects every realizable payoff pair once nature draws the types and players select their strategies. The image of Φ defines the general payoff space of the game, and its non-dominated points constitute the general ex-post Pareto frontier—all efficient realized outcomes across type-strategy realizations, without dependence on private probability measures over types. Using multi-objective genetic algorithms, we numerically approximate this frontier (and selected collusive compromises) within the restricted but representative sub-manifolds. The resulting frontiers are computationally accessible, robust to parameter variations, and validated through hypervolume convergence, sensitivity analysis, and comparisons with NSGA-II, PSO and scalarization methods.
The findings are significant because they provide decision-makers in oligopolistic markets (e.g., electric vehicles) with viable, implementable production policies that explore efficient trade-offs under genuine cost uncertainty, without requiring explicit forecasts of the opponent’s type distribution—a limitation of traditional expected-utility approaches. By focusing on ex-post efficiency, the method reveals belief-independent compromise solutions that may guide tacit coordination or collusive outcomes in real-world strategic settings. Full article
(This article belongs to the Special Issue AI in Game Theory: Theory and Applications)
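The core construction—sampling one-parameter oscillatory strategies, computing ex-post Cournot payoffs, and filtering non-dominated points—can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: the constants (Q, the demand intercept a, the realized cost pair s, t, the parameter range) are invented for the example, and a brute-force Pareto filter stands in for the genetic algorithm.

```python
import math, random

# Illustrative constants (NOT from the paper): production cap Q,
# inverse-demand intercept a, and one realized cost pair (s, t).
Q, a = 10.0, 20.0
s, t = 1.0, 1.5

def quantity(param, cost):
    # cosine-family strategy, affinely rescaled into [0, Q]
    return (Q / 2.0) * (1.0 + math.cos(param * cost))

def payoffs(alpha, beta):
    # ex-post Cournot profits under linear inverse demand p = a - x - y
    x, y = quantity(alpha, s), quantity(beta, t)
    price = max(a - x - y, 0.0)
    return (price - s) * x, (price - t) * y

def pareto_front(points):
    # keep payoff pairs not weakly dominated by any distinct point
    return [p for p in points
            if not any(q[0] >= p[0] and q[1] >= p[1] and q != p
                       for q in points)]

random.seed(0)
samples = [payoffs(random.uniform(0.5, 6), random.uniform(0.5, 6))
           for _ in range(500)]
front = pareto_front(samples)
```

A genetic algorithm such as NSGA-II replaces the random sampling with evolutionary search over (α, β), but the dominance filter above is the same notion of ex-post efficiency.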
19 pages, 1725 KB  
Article
Management of Chemical Synthesis Processes of Potassium Humate During Coal Beneficiation Waste Processing
by Roman Dychkovskyi, Dariusz Sala, Michał Pyzalski, Ivan Miroshnykov, Agnieszka Sujak, Karol Durczak, Igor Kotsan and Andrii Pererva
Sustainability 2026, 18(5), 2196; https://doi.org/10.3390/su18052196 - 25 Feb 2026
Abstract
The growing accumulation of coal beneficiation waste represents a significant environmental and technological challenge while simultaneously creating opportunities for resource recovery within circular economy frameworks. This study presents the development and process-oriented evaluation of an environmentally safe technology for converting coal beneficiation waste into potassium humate, with the simultaneous recovery of molybdenum compounds via alkaline extraction. The proposed solution is designed to improve resource efficiency, reduce the volume of waste directed to landfilling, and generate a high value-added product for agricultural and technological applications. The process flow includes preliminary characterization and preparation of the waste, determination of moisture, ash, and organic matter content, and the separation of metal-bearing fractions. Alkaline extraction was carried out using potassium hydroxide under controlled temperature and reaction time conditions, followed by purification and concentration of the humate solution. The process management strategy focuses on optimizing key technological parameters, including alkali concentration, solid-to-liquid ratio, temperature, and reaction time, to maximize humate yield while preserving the functional groups responsible for biological activity. Comprehensive physicochemical, thermal, and mineralogical analyses confirmed the stability of the aluminosilicate matrix and the suitability of the material for alkaline processing without adverse structural degradation. Biological tests using oat (Avena sativa) demonstrated that potassium humate derived from coal beneficiation waste exhibits higher growth-stimulating effectiveness than a conventional commercial humate. Economic analysis revealed a strong correlation between humic acid content and added value, confirming the feasibility of transforming coal beneficiation waste from an environmental burden into a valuable secondary resource. Full article
(This article belongs to the Special Issue Waste Management Strategies for Clean Coal Technologies)
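The parameter-optimization step described in the abstract (tuning alkali concentration, solid-to-liquid ratio, temperature, and reaction time to maximize humate yield) can be illustrated with a simple grid search. The yield function below is a placeholder concave response surface, not the authors' empirical model; all grid values are hypothetical.

```python
from itertools import product

def humate_yield(koh, s_l, temp, time):
    # Placeholder response surface peaking at mid-range settings;
    # stands in for an empirically fitted yield model.
    return ((1 - (koh - 1.0) ** 2)
            * (1 - (s_l - 0.1) ** 2 * 10)
            * (1 - ((temp - 80) / 40) ** 2)
            * (1 - ((time - 2) / 2) ** 2))

# Hypothetical factor levels for the four process parameters.
grid = product([0.5, 1.0, 1.5],     # KOH concentration, mol/L
               [0.05, 0.10, 0.20],  # solid-to-liquid ratio
               [60, 80, 100],       # temperature, deg C
               [1, 2, 3])           # reaction time, h
best = max(grid, key=lambda p: humate_yield(*p))
```

In practice a design-of-experiments or response-surface method would replace exhaustive enumeration, but the optimization structure is the same.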
Show Figures

Figure 1

12 pages, 2167 KB  
Article
Revisiting the Origin of the Star-Forming Main Sequence Based on a Volume-Limited Sample of ∼25,000 Galaxies
by Yang Gao, Shujiao Liang, Qinghua Tan, Enci Wang, Huilan Liu, Hongmei Wang, Tao Jing, Xiaolong Wang, Kaihui Liu, Ning Gai, Yanke Tang, Yifan Wang and Yutong Li
Universe 2026, 12(3), 60; https://doi.org/10.3390/universe12030060 - 25 Feb 2026
Abstract
We revisit the extensively debated star-forming main sequence (SFMS)—a tight correlation between the star formation rate and stellar mass in both kiloparsec-resolved and integrated galaxies. We statistically explore the fundamental drivers of star formation at global scales, using a large volume-limited sample of 24,954 local star-forming galaxies to overcome the limitations of previous works. Based on the mid-infrared 12 µm luminosity, stellar mass, and g−r color, we estimate the molecular gas mass for the considered sample. At galaxy-wide scales, we establish global relations between the surface densities of the star formation rate (Σ_SFR), stellar mass (Σ_*), and molecular gas mass (Σ_mol). These global density relations are connected with and follow similar trends as the resolved SFMS, the Kennicutt–Schmidt (KS) relation, and the molecular gas main sequence (MGMS). Taking advantage of this large catalog, we show that the scatters in the global KS and MGMS relations are smaller than that of the global relation between Σ_SFR and Σ_*, and their Pearson correlation coefficients are higher. More importantly, multivariate regression and partial correlation analyses demonstrate that the apparent Σ_SFR–Σ_* correlation is entirely mediated by Σ_mol, with its best-fit parameters directly derivable from those of the KS and MGMS relations. Overall, our findings suggest that the correlation between stellar mass and molecular gas, as well as that between molecular gas and star formation, are more direct and fundamental. The star-forming main sequence, thus, appears to be a natural by-product of these two tighter relations. Full article
(This article belongs to the Section Galaxies and Clusters)
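The mediation argument—that the SFR–stellar-mass correlation vanishes once molecular gas is controlled for—rests on partial correlation. A minimal sketch with synthetic data (not the survey catalog; the coefficients and noise levels are invented to mimic the claimed causal chain stars → gas → SFR):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
mstar = rng.normal(0.0, 1.0, n)               # log Sigma_* (synthetic)
mmol = 0.8 * mstar + rng.normal(0.0, 0.3, n)  # MGMS: gas tracks stars
sfr = mmol + rng.normal(0.0, 0.2, n)          # KS: SFR tracks gas

def partial_corr(x, y, z):
    # correlate residuals of x and y after regressing out z linearly
    rx = x - np.polyval(np.polyfit(z, x, 1), z)
    ry = y - np.polyval(np.polyfit(z, y, 1), z)
    return np.corrcoef(rx, ry)[0, 1]

r_direct = np.corrcoef(sfr, mstar)[0, 1]    # apparent "SFMS" correlation
r_partial = partial_corr(sfr, mstar, mmol)  # SFMS controlling for gas
```

With this generative structure the direct correlation is strong while the partial correlation is consistent with zero—the signature the paper reports for the real sample.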

24 pages, 2751 KB  
Article
Regression Analysis Under Interval-Valued Targets as an Imprecise Classification Problem
by Lev Utkin, Stanislav Kogan, Andrei Konstantinov and Vladimir Muliukha
Algorithms 2026, 19(3), 166; https://doi.org/10.3390/a19030166 - 24 Feb 2026
Abstract
Regression analysis with interval-valued outcomes presents a fundamental challenge in modeling data where uncertainty is inherent rather than incidental. Such data, arising naturally in fields ranging from meteorology to finance, require methods that preserve information about both central tendency and dispersion. We introduce a novel class of attention-based regression models that reformulates interval-valued regression as a multiclass classification task. The key idea behind the model is to partition the outcome domain into basic intervals derived from training data intersections and to represent each interval-valued observation as a set of feasible discrete probability distributions over these intervals. This imprecise probabilistic representation allows us to train a classification-style model by minimizing the expected log-likelihood over all consistent distributions. We propose two training algorithms: a Monte Carlo sampling approach and a more efficient joint optimization method that simultaneously updates both the constrained probability distributions and the model parameters. The model incorporates a kernel-based aggregation mechanism using trainable dot-product attention, where attention weights are computed from input features but applied to the probability distributions over basic intervals. Numerical experiments with real datasets illustrate the approach. By introducing the class of attention-based models for interval-valued regression, this work offers a novel perspective on applying machine learning to uncertain data. Codes implementing the proposed models are publicly available. Full article
(This article belongs to the Section Evolutionary Algorithms and Machine Learning)
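The discretization step behind the classification reformulation—cutting the outcome domain at all training interval endpoints and mapping each observation to the basic intervals it contains—can be sketched as follows. This is an assumed reading of the construction, not the authors' published code; the helper names and the toy targets are illustrative.

```python
def basic_intervals(targets):
    # targets: list of (lo, hi) interval-valued outcomes.
    # Cut the domain at every observed endpoint; consecutive cuts
    # form the basic intervals.
    cuts = sorted({v for lo, hi in targets for v in (lo, hi)})
    return list(zip(cuts[:-1], cuts[1:]))

def feasible_classes(interval, basics):
    # Indices of basic intervals contained in the observed interval.
    # Any discrete distribution supported on these indices is one of
    # the feasible distributions for this observation.
    lo, hi = interval
    return [k for k, (a, b) in enumerate(basics) if lo <= a and b <= hi]

targets = [(0.0, 2.0), (1.0, 3.0), (2.5, 4.0)]
basics = basic_intervals(targets)
classes = feasible_classes((1.0, 3.0), basics)
```

Training then treats each basic interval as a class and optimizes the expected log-likelihood over all distributions consistent with each observation's feasible set.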
Back to TopTop