Search Results (890)

Search Parameters:
Keywords = fit uncertainty

34 pages, 2542 KB  
Article
Uncertainty-Based Design Optimization Framework Based on Improved Chicken Swarm Algorithm and Bayesian Optimization Neural Network
by Qiang Ji, Ran Li and Shi Jing
Appl. Sci. 2025, 15(17), 9671; https://doi.org/10.3390/app15179671 - 2 Sep 2025
Abstract
As the complexity and functional integration of mechanism systems continue to increase in modern practical engineering, the challenges of changing environmental conditions and extreme working conditions are becoming increasingly severe. Traditional uncertainty-based design optimization (UBDO) suffers from low efficiency and slow convergence when dealing with nonlinear, high-dimensional, and strongly coupled problems. In response, this paper proposes a UBDO framework that integrates an efficient intelligent optimization algorithm with a high-quality surrogate model. By fusing butterfly search with Lévy flight optimization, an improved chicken swarm algorithm is introduced to address the imbalance between global exploration and local exploitation in the original algorithm. Additionally, Bayesian optimization is employed to fit the limit-state evaluation function using a BP neural network, reducing the high computational cost of repeated limit-state evaluations during uncertainty analysis. Finally, a decoupled optimization framework integrates uncertainty analysis with design optimization, enhancing global optimization capability under uncertainty and addressing results that would otherwise lack the accuracy or reliability required by the design. Results from engineering case studies demonstrate the effectiveness and superiority of the proposed UBDO framework. Full article
(This article belongs to the Special Issue Data-Enhanced Engineering Structural Integrity Assessment and Design)
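A rough sketch of the decoupled surrogate-plus-optimizer idea this abstract describes, with every specific swapped out: a scikit-learn MLP stands in for the Bayesian-optimized BP network, a toy limit-state function g(x) replaces the engineering model, and SciPy's differential evolution substitutes for the improved chicken swarm algorithm.

```python
# Hypothetical sketch of a decoupled UBDO loop: a neural-net surrogate
# replaces expensive limit-state evaluations inside the optimizer.
# g(x), the penalty weights, and the optimizer are all stand-ins.
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

def g(x):  # placeholder limit state: failure when g(x) < 0
    return 1.5 - x[..., 0] ** 2 - 0.5 * x[..., 1]

# 1) Fit the surrogate on a small design of experiments
X = rng.uniform(-2, 2, size=(300, 2))
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                         random_state=0).fit(X, g(X))

def failure_prob(x, n=2000, sigma=0.1):
    # Monte Carlo reliability estimate around design x, on the surrogate
    rng_mc = np.random.default_rng(42)
    samples = x + sigma * rng_mc.standard_normal((n, 2))
    return np.mean(surrogate.predict(samples) < 0.0)

def objective(x):
    # design cost plus a penalty enforcing P_f <= 1e-2
    return np.sum((x - 1.0) ** 2) + 1e3 * max(0.0, failure_prob(x) - 1e-2)

# 2) Search the design space against the surrogate-based reliability check
result = differential_evolution(objective, bounds=[(-2, 2)] * 2,
                                maxiter=30, seed=0)
print(result.x, failure_prob(result.x))
```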

20 pages, 7108 KB  
Article
Improved Determination of Particle Backscattering Coefficient Using Four-Angle Volume Scattering Measurements
by Chang Han, Bangyi Tao, Yunzhou Li, Qingjun Song, Haiqing Huang and Zhihua Mao
Remote Sens. 2025, 17(17), 2990; https://doi.org/10.3390/rs17172990 - 28 Aug 2025
Abstract
The backscattering coefficient of aquatic particles (b_bp(λ)) is one of the most important inherent optical properties in remote sensing. Due to the practical difficulties associated with measurements of the volume scattering function (VSF) over the whole backward hemisphere (90°–180°), b_bp(λ) is estimated using either a single-angle approach, which employs the VSF at a fixed angle multiplied by a conversion factor χ_p(θ;λ), or a multi-angle approach, which uses the VSF at multiple angles with polynomial fitting. The angular variation in the VSF in the backward angles introduces uncertainties into b_bp(λ) estimation. In this study, 178 VSF datasets from the global ocean were investigated. χ_p exhibited wavelength, regional, and angular variations. Although χ_p exhibited the lowest variability at 120° (χ_p(120°;λ)), the single-angle approach exhibited a 12.71% mean absolute percent difference (MAPD) and a root mean squared error (RMSE) of approximately 4.02 × 10⁻³ m⁻¹. χ_p(140°;λ) exhibited larger variations at different wavelengths and in coastal regions. The three-angle approach exhibits wavelength independence and lower uncertainties, but the uncertainty of the polynomial fitting results at angles greater than 150° is relatively large, and the MAPD is still up to 10.92%. A better four-angle approach (100°, 120°, 140°, and 160°) was proposed, which could accurately determine b_bp(λ) with the lowest MAPD (3.12%) and RMSE (0.86 × 10⁻³ m⁻¹). Notably, expanding to five angles provided minimal additional improvement, with the reduction in the MAPD being less than 1% compared to the four-angle approach. This study provides valuable insights into developing advanced optical sensors with better angular configurations for measuring b_bp(λ). Full article
(This article belongs to the Section Earth Observation Data)
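For illustration, a minimal sketch of the multi-angle scheme described above: fit a polynomial to VSF values at the four proposed angles and integrate over the backward hemisphere, b_bp = 2π ∫ β(θ) sin θ dθ from 90° to 180°. The VSF values below are invented, not data from the study.

```python
# Four-angle b_bp estimate: polynomial fit of the VSF, then numerical
# integration over the backward hemisphere (illustrative values only).
import numpy as np
from scipy.integrate import trapezoid

theta_deg = np.array([100.0, 120.0, 140.0, 160.0])
beta = np.array([4.1e-4, 3.2e-4, 3.0e-4, 3.4e-4])  # m^-1 sr^-1, made up

coeffs = np.polyfit(np.radians(theta_deg), beta, deg=2)  # quadratic in theta

grid = np.radians(np.linspace(90.0, 180.0, 1000))
b_bp = 2.0 * np.pi * trapezoid(np.polyval(coeffs, grid) * np.sin(grid), grid)
print(f"b_bp ≈ {b_bp:.2e} m^-1")
```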

23 pages, 5401 KB  
Article
Accelerating Thermally Safe Operating Area Assessment of Ignition Coils for Hydrogen Engines via AI-Driven Power Loss Estimation
by Federico Ricci, Mario Picerno, Massimiliano Avana, Stefano Papi, Federico Tardini and Massimo Dal Re
Vehicles 2025, 7(3), 90; https://doi.org/10.3390/vehicles7030090 - 25 Aug 2025
Abstract
In order to determine thermally safe driving parameters of ignition coils for hydrogen internal combustion engines (ICE), a reliable estimation of internal power losses is essential. These losses include resistive winding losses, magnetic core losses due to hysteresis and eddy currents, dielectric losses in the insulation, and electronic switching losses. Direct experimental assessment is difficult because the components are inaccessible, while conventional computer-aided engineering (CAE) approaches face challenges such as the need for accurate input data and detailed 3D models, long computation times, and uncertainties in loss prediction for complex structures. To address these limitations, we propose an artificial intelligence (AI)-based framework for estimating internal losses from external temperature measurements. The method relies on an artificial neural network (ANN), trained to capture the relationship between external coil temperatures and internal power losses. The trained model is then employed within an optimization process to identify losses corresponding to experimental temperature values. Validation is performed by introducing the identified power losses into a CAE thermal model and comparing predicted and experimental temperatures. The results show excellent agreement, with errors below 3% across the −30 °C to 125 °C range. This demonstrates that the proposed hybrid ANN–CAE approach achieves high accuracy while reducing experimental effort and computational demand. Furthermore, the methodology allows for a straightforward determination of the coil safe operating area (SOA). Starting from estimates derived from fitted linear trends, the SOA limits can be efficiently refined through iterative verification with the CAE model. Overall, the ANN–CAE framework provides a robust and practical tool to accelerate thermal analysis and support coil development for hydrogen ICE applications. Full article
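A hedged sketch of the inversion step the abstract outlines: a network is trained on a forward model mapping internal losses to external temperatures, then an optimizer searches for the loss vector that reproduces a measured temperature. The linear "thermal model" and all numbers below are stand-ins for the CAE simulations.

```python
# Train losses -> temperatures, then invert by optimization.
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import minimize

rng = np.random.default_rng(1)
A = np.array([[0.8, 0.3, 0.1], [0.2, 0.9, 0.2]])  # toy losses->temps map

losses = rng.uniform(0, 10, size=(500, 3))          # W, three loss terms
temps = losses @ A.T + 25.0                         # °C at two probes

net = MLPRegressor(hidden_layer_sizes=(64,), max_iter=3000,
                   random_state=1).fit(losses, temps)

measured = np.array([[31.0, 34.5]])                 # hypothetical reading

def mismatch(p):
    # squared error between predicted and measured probe temperatures
    return np.sum((net.predict(p.reshape(1, -1)) - measured) ** 2)

fit = minimize(mismatch, x0=np.full(3, 5.0), bounds=[(0, 10)] * 3)
print("estimated internal losses [W]:", fit.x.round(2))
```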

32 pages, 15059 KB  
Article
Impact of Land Use Patterns on Flood Risk in the Chang-Zhu-Tan Urban Agglomeration, China
by Ting Zhang, Kai Wu, Xiulian Wang, Xinai Li, Long Li and Longqian Chen
Remote Sens. 2025, 17(16), 2889; https://doi.org/10.3390/rs17162889 - 19 Aug 2025
Abstract
Flood risk assessment is an effective tool for disaster prevention and mitigation. As land use is a key factor influencing flood disasters, studying the impact of different land use patterns on flood risk is crucial. This study evaluates flood risk in the Chang-Zhu-Tan (CZT) urban agglomeration by selecting 17 socioeconomic and natural environmental factors within a risk assessment framework encompassing hazard, exposure, vulnerability, and resilience. Additionally, the Patch-Generating Land Use Simulation (PLUS) and multilayer perceptron (MLP)/Bayesian network (BN) models were coupled to predict flood risks under three future land use scenarios: natural development, urban construction, and ecological protection. This integrated modeling framework combines MLP's high-precision nonlinear fitting with BN's probabilistic inference, effectively mitigating the prediction uncertainty of traditional single-model approaches while preserving predictive accuracy and enhancing causal interpretability. The results indicate that high-risk flood zones are predominantly concentrated along the Xiang River, while medium-high- and medium-risk areas are mainly distributed on the periphery of high-risk zones, exhibiting a gradient decline. Low-risk areas are scattered in mountainous regions far from socioeconomic activities. Future land use was simulated using the PLUS model, which achieved a Kappa coefficient of 0.78 and an overall accuracy of 0.87. Under all future scenarios, cropland decreases while construction land increases. Forestland decreases in all scenarios except ecological protection, where it expands. In future risk predictions, the MLP model achieved a high accuracy of 97.83%, while the BN model reached 87.14%. Both models consistently indicated that flood risk was minimized under the ecological protection scenario and maximized under the urban construction scenario. Therefore, adopting ecological protection measures can effectively mitigate flood risks, offering valuable guidance for future disaster prevention and mitigation strategies. Full article
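As a small illustration of the two validation metrics quoted for the land-use simulation (Kappa of 0.78, overall accuracy of 0.87), the snippet below computes Cohen's kappa and overall accuracy between an observed and a simulated category raster; the rasters are synthetic.

```python
# Cohen's kappa and overall accuracy on flattened land-use rasters.
import numpy as np
from sklearn.metrics import cohen_kappa_score, accuracy_score

rng = np.random.default_rng(2)
observed = rng.integers(0, 4, size=10_000)            # 4 land-use classes
simulated = np.where(rng.random(10_000) < 0.85,       # ~85% agreement
                     observed, rng.integers(0, 4, size=10_000))

print("kappa   :", round(cohen_kappa_score(observed, simulated), 2))
print("accuracy:", round(accuracy_score(observed, simulated), 2))
```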

34 pages, 2970 KB  
Article
Combined Particle Swarm Optimization and Reinforcement Learning for Water Level Control in a Reservoir
by Oana Niculescu-Faida and Catalin Popescu
Sensors 2025, 25(16), 5055; https://doi.org/10.3390/s25165055 - 14 Aug 2025
Abstract
This article focuses on the research and development of an optimal system for automatically regulating the water level in a reservoir to prevent flooding in the surrounding area. As a practical application, the regulation of the level in the Mariselu Reservoir at the dam in Bistrita–Nasaud County, Romania, was considered. Industrial PID controller tuning provides robust and stable solutions; however, the controller parameters may require frequent retuning owing to uncertainties and changes in operating conditions. To overcome this inconvenience, an adaptive adjustment of the PID controller parameters is necessary, combining two parameter optimization methods: reinforcement learning and Particle Swarm Optimization. A new optimization method was developed that uses a mathematical equation to guide the Particle Swarm Optimization method, which in essence enhances the fitness function of reinforcement learning, thus obtaining a control system that combines the advantages of the two methods while minimizing their disadvantages. The method was tested by simulation using MATLAB and Python with very good results, after which it was implemented and successfully prevented floods in the area where it was deployed. This optimal automation system for dams should be implemented and adapted for several dams in Romania. Full article
(This article belongs to the Special Issue Intelligent Industrial Process Control Systems: 2nd Edition)
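A toy rendering of the core idea, under heavy assumptions: Particle Swarm Optimization searching PID gains that minimize tracking error on a simulated reservoir level. The first-order tank dynamics and all constants are invented, and the guiding equation and reinforcement-learning component are omitted; this shows only the PSO-over-PID-gains skeleton.

```python
# Minimal PSO tuning (kp, ki, kd) on a toy water-level loop.
import numpy as np

rng = np.random.default_rng(3)

def simulate_pid(gains, setpoint=2.0, dt=1.0, steps=200):
    kp, ki, kd = gains
    level, integ, prev_err, cost = 0.0, 0.0, setpoint, 0.0
    for _ in range(steps):
        err = setpoint - level
        integ += err * dt
        u = np.clip(kp * err + ki * integ + kd * (err - prev_err) / dt, 0, 5)
        level += dt * (0.05 * u - 0.02 * level)   # inflow minus outflow
        prev_err = err
        cost += abs(err) * dt                     # integral absolute error
    return cost

n, dims = 20, 3
pos = rng.uniform(0, 10, (n, dims)); vel = np.zeros((n, dims))
pbest = pos.copy(); pcost = np.array([simulate_pid(p) for p in pos])
gbest = pbest[pcost.argmin()].copy()
for _ in range(50):
    r1, r2 = rng.random((2, n, dims))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 10)
    cost = np.array([simulate_pid(p) for p in pos])
    improved = cost < pcost
    pbest[improved], pcost[improved] = pos[improved], cost[improved]
    gbest = pbest[pcost.argmin()].copy()
print("tuned (kp, ki, kd):", gbest.round(2))
```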

23 pages, 3705 KB  
Article
Determination of Trends in GPS Time Series Using Complementary Ensemble Empirical Mode Decomposition
by Agnieszka Wnęk and Dawid Kudas
Remote Sens. 2025, 17(16), 2802; https://doi.org/10.3390/rs17162802 - 13 Aug 2025
Abstract
Time series of Global Positioning System (GPS) station positions include signals whose characteristics vary over time. Therefore, in detailed analyses, methods dedicated to nonstationary time series should be used. In this study, the Complementary Ensemble Empirical Mode Decomposition (CEEMD) method was used to model trends in time series of GPS station positions and to verify their nonlinearity. As the CEEMD method does not provide equations for assessing the uncertainty of the determined trend, we propose using the bootstrap method for this purpose. Daily time series of the Up components of 25 GPS stations from seismic regions in Europe, obtained from the Nevada Geodetic Laboratory (NGL) service, were used. The determined trends were compared with trends calculated using the Singular Spectrum Analysis (SSA) method and then interpreted in the context of earthquakes occurring in the vicinity of each station. In turn, the bootstrap method was used to estimate the mean standard deviations of the trends determined by both the CEEMD and SSA methods. The conducted studies showed the usefulness of the CEEMD method for modeling trends in time series of GPS station positions, especially for stations where changes may occur on short time scales, visible as local nonlinearity of the trend, mainly due to earthquake events. The bootstrap-estimated mean standard deviation values for the modeled nonlinear trends are at the level of 1–3 mm, depending on the station. In turn, the root mean square error (RMSE) between the nonlinear trend determined by the CEEMD method and the linear trend fitted to it by the least squares method does not exceed 3 mm. The conducted research indicates that the CEEMD method can be successfully used to model locally nonlinear trends resulting from earthquakes, and the mean standard deviation of the estimated trends is relatively low. Full article
(This article belongs to the Special Issue Advances in GNSS for Time Series Analysis)
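A compact sketch of the proposed bootstrap step: resample residuals around a fitted trend, refit each replicate, and take the pointwise standard deviation across replicates as the trend uncertainty. A polynomial stands in for the CEEMD trend extraction, and the daily "Up" series is synthetic.

```python
# Residual bootstrap for trend uncertainty on a synthetic Up series (mm).
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(3000) / 365.25                        # ~8 years, daily
up = 2.0 * t + 3.0 * np.sin(2 * np.pi * t) + rng.normal(0, 2.0, t.size)

def trend(y):                                       # stand-in for CEEMD
    return np.polyval(np.polyfit(t, y, 3), t)

base = trend(up)
resid = up - base
reps = np.array([trend(base + rng.choice(resid, resid.size, replace=True))
                 for _ in range(200)])
sigma = reps.std(axis=0)                            # pointwise std, mm
print(f"mean trend std ≈ {sigma.mean():.2f} mm")
```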

25 pages, 4654 KB  
Article
Modeling Herbaceous Biomass and Assessing Degradation Risk in the Caatinga Biome Using Monte Carlo Simulation
by Jefta Ruama de Oliveira Figueiredo, José Morais Pereira Filho, Jefferson Ferreira de Freitas Feitosa, Magno José Duarte Cândido, Sonel Gilles, Olaf Andreas Bakke, Samuel Rocha Maranhão, Ana Clara Rodrigues Cavalcante, Ricardo Loiola Edvan and Leilson Rocha Bezerra
Sustainability 2025, 17(16), 7267; https://doi.org/10.3390/su17167267 - 12 Aug 2025
Abstract
Simulating scenarios under climate change is essential to understanding vegetation dynamics, ensuring the survival of forage species, and minimizing uncertainties in project costs and timelines. This study aimed to simulate historical probabilities and develop a biomass production model using PHYGROW software (Texas A&M University, College Station, TX, USA), combined with Monte Carlo Simulation (MCS) in the @RISK program (Ithaca, NY, USA), to evaluate long-term biomass production in a native pasture area of the Caatinga biome. The results show strong agreement between software estimates and field data. For 2016, PHYGROW estimated 883 kg/ha, while field measurements reached 836.8 kg/ha; for 2017, 1117 kg/ha was estimated, while 992.15 kg/ha was observed. For 2018, the model estimated 1200 kg/ha compared with 1763.5 kg/ha in the field, and for 2019, 1230 kg/ha was estimated versus the 1294.3 kg/ha observed. The Monte Carlo simulations indicated that the Weibull distribution best fitted the synthetic series, with 90% adherence. Biomass production values ranged from 618 to 1427 kg/ha with a 90% probability. Only 5% of the simulations projected values below 600 kg/ha or above 1400 kg/ha. Moreover, there was a 95% risk of production issues if planning was based on biomass values above 1000 kg/ha. These findings highlight PHYGROW’s potential for pasture management under semi-arid conditions for predicting and avoiding degradation scenarios that could even lead to areas of desertification. Full article
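A minimal sketch of the Monte Carlo step, assuming a generic workflow rather than the @RISK implementation: fit a Weibull distribution to a biomass series and read production bounds from simulated draws. The synthetic series below replaces the PHYGROW output.

```python
# Fit a Weibull distribution and extract a 90% production interval.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
biomass = rng.weibull(4.0, 500) * 1100.0            # synthetic kg/ha

shape, loc, scale = stats.weibull_min.fit(biomass, floc=0.0)
draws = stats.weibull_min.rvs(shape, loc=loc, scale=scale,
                              size=100_000, random_state=5)
lo, hi = np.percentile(draws, [5, 95])
print(f"90% of simulated production in [{lo:.0f}, {hi:.0f}] kg/ha")
```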

24 pages, 789 KB  
Article
Seeing Is Believing: The Impact of AI Magic Mirror on Consumer Purchase Intentions in Medical Aesthetic Services
by Yu Li, Chujun Zhang, Tian Shen and Xi Chen
J. Theor. Appl. Electron. Commer. Res. 2025, 20(3), 205; https://doi.org/10.3390/jtaer20030205 - 7 Aug 2025
Abstract
The integration of AI into online platforms is reshaping consumer experience and behavior. While existing research has largely focused on the role of AI in search services and experience services, few studies have examined the role of AI in the context of credence services. This study fills this gap by investigating an AI-powered preview tool in the context of online medical aesthetic platforms. Specifically, this study investigates how the AI Magic Mirror influences consumer purchase intentions in medical aesthetic services. Using secondary data analysis and two experimental studies, we examine the main effects, as well as mediation and moderation effects. The findings consistently demonstrate that the AI Magic Mirror significantly increases consumer purchase intentions. This relationship is positively mediated by perceived value and negatively mediated by perceived risk. In addition, the main effect is stronger for procedures with higher fit uncertainty and is more pronounced for those with lower popularity. These results provide theoretical insights into AI application in credence service contexts and offer practical implications for the design of AI-enhanced online service platforms. Full article

16 pages, 7134 KB  
Article
The Impact of an Object’s Surface Material and Preparatory Actions on the Accuracy of Optical Coordinate Measurement
by Danuta Owczarek, Ksenia Ostrowska, Jerzy Sładek, Adam Gąska, Wiktor Harmatys, Krzysztof Tomczyk, Danijela Ignjatović and Marek Sieja
Materials 2025, 18(15), 3693; https://doi.org/10.3390/ma18153693 - 6 Aug 2025
Abstract
Optical coordinate measurement is a universal technique that aligns with the rapid development of industrial technologies and new materials. Nevertheless, can this technique be consistently effective when applied to the precise measurement of all types of materials? As shown in this article, an analysis of optical measurement systems reveals that some materials cause difficulties during the scanning process. This article details the matting process, which, as demonstrated, yields lower measurement uncertainty than the pre-matting state, and identifies materials for which applying a matting spray significantly improves measurement quality. The authors propose a classification of materials into easy-to-scan and hard-to-scan groups, along with specific procedures to improve measurements, especially for the latter. Tests were conducted in an accredited Laboratory of Coordinate Metrology using an articulated arm with a laser probe. Measured objects included spheres made of ceramic, tungsten carbide (including a matte finish), aluminum oxide, titanium nitride-coated steel, and photopolymer resin, with reference diameters established by a high-precision Leitz PMM 12106 coordinate measuring machine. Diameters were determined from point clouds obtained via optical measurements using the best-fit method, both before and after matting. Color measurements using a spectrocolorimeter supplemented this study to assess the effect of matting on surface color. The results revealed correlations between material type and measurement accuracy. Full article
(This article belongs to the Section Optical and Photonic Materials)
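As an aside, the best-fit step mentioned above can be sketched compactly: a sphere's center and diameter estimated from a point cloud by linear least squares, using the expansion of |p − c|² = r². The scanned points below are synthetic.

```python
# Least-squares sphere fit: p·p = 2c·p + (r² − c·c), linear in (c, r²−c·c).
import numpy as np

rng = np.random.default_rng(6)
true_c, true_r = np.array([10.0, -5.0, 3.0]), 12.7  # mm
dirs = rng.normal(size=(5000, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = true_c + true_r * dirs + rng.normal(0, 0.02, (5000, 3))  # scan noise

A = np.hstack([2 * pts, np.ones((len(pts), 1))])
b = (pts ** 2).sum(axis=1)
sol, *_ = np.linalg.lstsq(A, b, rcond=None)
center, radius = sol[:3], np.sqrt(sol[3] + (sol[:3] ** 2).sum())
print("diameter ≈", round(2 * radius, 3), "mm")
```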

24 pages, 3291 KB  
Article
Machine Learning Subjective Opinions: An Application in Forensic Chemistry
by Anuradha Akmeemana and Michael E. Sigman
Algorithms 2025, 18(8), 482; https://doi.org/10.3390/a18080482 - 4 Aug 2025
Abstract
Simulated data created in silico using a previously reported method were sampled by bootstrapping to generate data sets for training multiple copies of an ensemble learner (i.e., a machine learning (ML) method). The posterior probabilities of class membership obtained by applying the ensemble of ML models to previously unseen validation data were fitted to a beta distribution. The shape parameters for the fitted distribution were used to calculate the subjective opinion of sample membership in one of two mutually exclusive classes. The subjective opinion consists of belief, disbelief, and uncertainty masses. A subjective opinion for each validation sample allows identification of high-uncertainty predictions. The projected probabilities of the validation opinions were used to calculate log-likelihood ratio scores and generate receiver operating characteristic (ROC) curves from which an opinion-supported decision can be made. Three very different ML models, linear discriminant analysis (LDA), random forest (RF), and support vector machines (SVM), were applied to the two-state classification problem in the analysis of forensic fire debris samples. For each ML method, a set of 100 ML models was trained on data sets bootstrapped from 60,000 in silico samples. The impact of training data set size on opinion uncertainty and ROC area under the curve (AUC) was studied. The median uncertainty for the validation data was smallest for LDA and largest for SVM. The median uncertainty continually decreased as the size of the training data set increased for all ML methods. The AUC for ROC curves based on projected probabilities was largest for the RF model and smallest for the LDA method. The ROC AUC was statistically unchanged for LDA at training data sets exceeding 200 samples; however, the AUC increased with increasing sample size for the RF and SVM methods. The SVM method, the slowest to train, was limited to a maximum of 20,000 training samples. All three ML methods showed increasing performance when the validation data were limited to higher ignitable liquid contributions. An ensemble of 100 RF ML models, each trained on 60,000 in silico samples, performed best, with a median uncertainty of 1.39 × 10⁻² and ROC AUC of 0.849 for all validation samples. Full article
(This article belongs to the Special Issue Artificial Intelligence in Modeling and Simulation (2nd Edition))
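A sketch of the opinion-forming step under stated assumptions: fit a beta distribution to an ensemble's 100 posterior probabilities for one sample, then convert the shape parameters to belief/disbelief/uncertainty masses via the standard subjective-logic mapping with prior weight W = 2 and base rate a = 0.5 (the mapping is assumed, not quoted from the paper).

```python
# Beta fit of ensemble posteriors -> subjective-logic opinion masses.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
probs = np.clip(rng.normal(0.8, 0.07, 100), 1e-3, 1 - 1e-3)  # toy ensemble

alpha, beta_, loc, scale = stats.beta.fit(probs, floc=0.0, fscale=1.0)

W, a = 2.0, 0.5                                 # assumed prior weight/base rate
r, s = alpha - W * a, beta_ - W * (1 - a)       # implied evidence counts
total = W + r + s
belief, disbelief, uncertainty = r / total, s / total, W / total
print(f"b={belief:.3f} d={disbelief:.3f} u={uncertainty:.3f}")
```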

23 pages, 7257 KB  
Article
The Development and Statistical Analysis of a Material Strength Database of Existing Italian Prestressed Concrete Bridges
by Michele D’Amato, Antonella Ranaldo, Monica Rosciano, Alessandro Zona, Michele Morici, Laura Gioiella, Fabio Micozzi, Alberto Poeta, Virginio Quaglini, Sara Cattaneo, Dalila Rossi, Carlo Pettorruso, Walter Salvatore, Agnese Natali, Simone Celati, Filippo Ubertini, Ilaria Venanzi, Valentina Giglioni, Laura Ierimonti, Andrea Meoni, Michele Titton, Paola Pannuzzo and Andrea Dall’Asta
Infrastructures 2025, 10(8), 203; https://doi.org/10.3390/infrastructures10080203 - 2 Aug 2025
Abstract
This paper reports a statistical analysis of a database archiving information on the strengths of the materials in existing Italian bridges having pre- and post-tensioned concrete beams. Data were collected in anonymous form by analyzing a stock of about 170 bridges built between 1960 and 2000 and located in several Italian regions. To date, the database covers steel reinforcing bars, concrete, and prestressing steel, whose strengths were gathered from design nominal values, acceptance certificates, and in situ test results, all derived by consulting the available documents for each examined bridge. This paper first describes how the available data were collected. Then, the results of a statistical analysis are presented and commented on. Moreover, goodness-of-fit tests are carried out to verify the validity of assuming a normal distribution for steel reinforcing bars and prestressing steel, and a log-normal distribution for concrete. The database represents a valuable resource for researchers and practitioners in the assessment of existing bridges. It may serve as prior knowledge within frameworks that include Bayesian methods for reducing uncertainties. The database provides essential information on material strengths for use in a simulated design and/or for verification in cases of limited knowledge. The goodness-of-fit tests make the collected information useful even when probabilistic methods are applied. Full article
(This article belongs to the Section Infrastructures and Structural Engineering)
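For illustration, a minimal version of the distributional checks described: Kolmogorov–Smirnov tests of normality for steel strengths and of log-normality for concrete strengths, on synthetic values. Parameters are estimated from the data, so the p-values are approximate (a Lilliefors-type correction would be stricter).

```python
# KS goodness-of-fit checks: normal for steel, log-normal for concrete.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
steel = rng.normal(450.0, 30.0, 200)              # rebar yield strength, MPa
concrete = rng.lognormal(np.log(35.0), 0.2, 200)  # compressive strength, MPa

ks_steel = stats.kstest(steel, "norm", args=(steel.mean(), steel.std(ddof=1)))
logc = np.log(concrete)
ks_conc = stats.kstest(logc, "norm", args=(logc.mean(), logc.std(ddof=1)))
print(f"steel   : p = {ks_steel.pvalue:.3f}  (normal)")
print(f"concrete: p = {ks_conc.pvalue:.3f}  (log-normal)")
```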

18 pages, 3354 KB  
Article
Hydrological Modeling of the Chikugo River Basin Using SWAT: Insights into Water Balance and Seasonal Variability
by Francis Jhun Macalam, Kunyang Wang, Shin-ichi Onodera, Mitsuyo Saito, Yuko Nagano, Masatoshi Yamazaki and Yu War Nang
Sustainability 2025, 17(15), 7027; https://doi.org/10.3390/su17157027 - 2 Aug 2025
Abstract
Integrated hydrological modeling plays a crucial role in advancing sustainable water resource management, particularly in regions facing seasonal and extreme precipitation events. However, comprehensive studies that assess hydrological variability in temperate river basins remain limited. This study addresses this gap by evaluating the performance of the Soil and Water Assessment Tool (SWAT) in simulating streamflow, water balance, and seasonal hydrological dynamics in the Chikugo River Basin, Kyushu Island, Japan. The basin, originating from Mount Aso and draining into the Ariake Sea, is subject to frequent typhoons and intense rainfall, making it a critical case for sustainable water governance. Using the Sequential Uncertainty Fitting Version 2 (SUFI-2) approach, we calibrated the SWAT model over the period 2007–2021. Water balance analysis revealed that baseflow plays a dominant role in basin hydrology, providing a stable groundwater contribution that is essential for agricultural and domestic water needs despite increasing precipitation and varying water demand. These findings contribute to a deeper understanding of hydrological behavior in temperate catchments and offer a scientific foundation for sustainable water allocation, planning, and climate resilience strategies. Full article
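A small sketch of the goodness-of-fit metric conventionally reported alongside SUFI-2 calibration of SWAT, the Nash–Sutcliffe efficiency (NSE); the streamflow values are invented.

```python
# Nash-Sutcliffe efficiency: 1 is perfect; > 0.5 is often deemed acceptable.
import numpy as np

observed = np.array([12.0, 30.5, 85.0, 40.2, 22.1, 15.8])   # m^3/s
simulated = np.array([10.5, 28.0, 90.3, 44.0, 20.0, 14.9])

nse = 1.0 - np.sum((observed - simulated) ** 2) / \
            np.sum((observed - observed.mean()) ** 2)
print(f"NSE = {nse:.3f}")
```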

19 pages, 3818 KB  
Article
Robotic Arm Trajectory Planning in Dynamic Environments Based on Self-Optimizing Replay Mechanism
by Pengyao Xu, Chong Di, Jiandong Lv, Peng Zhao, Chao Chen and Ruotong Wang
Sensors 2025, 25(15), 4681; https://doi.org/10.3390/s25154681 - 29 Jul 2025
Abstract
In complex dynamic environments, robotic arms face multiple challenges such as real-time environmental changes, high-dimensional state spaces, and strong uncertainties. Trajectory planning tasks based on deep reinforcement learning (DRL) suffer from difficulties in acquiring human expert strategies, low experience utilization (leading to slow convergence), and unreasonable reward function design. To address these issues, this paper designs a neural network-based expert-guided triple experience replay mechanism (NETM) and proposes an improved reward function adapted to dynamic environments. This replay mechanism integrates imitation learning's fast data fitting with DRL's self-optimization to expand limited expert demonstrations and algorithm-generated successes into optimized expert experiences. Experimental results show that the expanded expert experience accelerates convergence: in dynamic scenarios, NETM improves accuracy by over 30% and the safety rate by 2.28% compared to baseline algorithms. Full article
(This article belongs to the Section Sensors and Robotics)
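A loose sketch of the replay idea, not the NETM mechanism itself: a buffer that stores expert demonstrations alongside the agent's own transitions and samples minibatches with a fixed expert fraction. Class and parameter names are hypothetical.

```python
# Generic source-aware replay buffer mixing expert and agent experience.
import random
from collections import deque

class MixedReplayBuffer:
    def __init__(self, capacity=10_000, expert_fraction=0.3):
        self.agent = deque(maxlen=capacity)
        self.expert = deque(maxlen=capacity)
        self.expert_fraction = expert_fraction

    def add(self, transition, from_expert=False):
        (self.expert if from_expert else self.agent).append(transition)

    def sample(self, batch_size):
        # draw a fixed share from expert memory, fill the rest from agent
        n_exp = min(int(batch_size * self.expert_fraction), len(self.expert))
        batch = random.sample(self.expert, n_exp)
        batch += random.sample(self.agent,
                               min(batch_size - n_exp, len(self.agent)))
        return batch

buf = MixedReplayBuffer()
buf.add(("s0", "a0", 1.0, "s1"), from_expert=True)
for i in range(100):
    buf.add((f"s{i}", "a", 0.0, f"s{i + 1}"))
print(len(buf.sample(8)))
```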

19 pages, 2311 KB  
Article
Stochastic Optimization of Quality Assurance Systems in Manufacturing: Integrating Robust and Probabilistic Models for Enhanced Process Performance and Product Reliability
by Kehinde Afolabi, Busola Akintayo, Olubayo Babatunde, Uthman Abiola Kareem, John Ogbemhe, Desmond Ighravwe and Olanrewaju Oludolapo
J. Manuf. Mater. Process. 2025, 9(8), 250; https://doi.org/10.3390/jmmp9080250 - 23 Jul 2025
Abstract
This research integrates stochastic optimization techniques with robust modeling and probabilistic modeling approaches to enhance photovoltaic cell manufacturing processes and product reliability. The study employed an adapted genetic algorithm to tackle uncertainties in the manufacturing process, resulting in improved operational efficiency. It consistently achieved optimal fitness, with values remaining at 1.0 over 100 generations. The model displayed a dynamic convergence rate, demonstrating its ability to adjust performance in response to process fluctuations. The system preserved resource efficiency by utilizing approximately 2600 units per generation, while minimizing machine downtime to 0.03%. Product reliability reached an average level of 0.98, with a maximum value of 1.02, indicating enhanced consistency. The manufacturing process achieved better optimization through a significant reduction in defect rates, which fell to 0.04. The objective function value fluctuated between 0.86 and 0.96, illustrating how the model effectively managed conflicting variables. Sensitivity analysis revealed that changes in sigma material and lambda failure had a minimal effect on average reliability, which stayed above 0.99, while average defect rates remained below 0.05. This research exemplifies how stochastic, robust, and probabilistic optimization methods can collaborate to enhance manufacturing system quality assurance and product reliability under uncertain conditions. Full article
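A generic sketch of the adapted genetic-algorithm loop the abstract describes (selection, crossover, mutation over process parameters); the fitness function, capped at 1.0 as in the reported runs, and all constants are placeholders rather than the paper's model.

```python
# Toy GA: truncation selection, one-point crossover, Gaussian mutation.
import numpy as np

rng = np.random.default_rng(9)

def fitness(x):  # placeholder: reward low defect proxy, capped at 1.0
    return min(1.0, 1.0 / (1.0 + np.sum((x - 0.3) ** 2)))

pop = rng.random((40, 5))                       # 40 individuals, 5 params
for gen in range(100):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]     # keep the best half
    kids = []
    for _ in range(40):
        a, b = parents[rng.integers(0, 20, 2)]
        cut = rng.integers(1, 5)
        child = np.concatenate([a[:cut], b[cut:]])           # crossover
        child += rng.normal(0, 0.02, 5) * (rng.random(5) < 0.2)  # mutation
        kids.append(np.clip(child, 0, 1))
    pop = np.array(kids)
print("best fitness:", max(fitness(ind) for ind in pop))
```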

21 pages, 2049 KB  
Article
Tracking Lava Flow Cooling from Space: Implications for Erupted Volume Estimation and Cooling Mechanisms
by Simone Aveni, Gaetana Ganci, Andrew J. L. Harris and Diego Coppola
Remote Sens. 2025, 17(15), 2543; https://doi.org/10.3390/rs17152543 - 22 Jul 2025
Abstract
Accurate estimation of erupted lava volumes is essential for understanding volcanic processes, interpreting eruptive cycles, and assessing volcanic hazards. Traditional methods based on Mid-Infrared (MIR) satellite imagery require clear-sky conditions during eruptions and are prone to sensor saturation, limiting data availability. Here, we present an alternative approach based on the post-eruptive Thermal InfraRed (TIR) signal, using the recently proposed VRP_TIR method to quantify radiative energy loss during lava flow cooling. We identify thermally anomalous pixels in VIIRS I5 scenes (11.45 µm, 375 m resolution) using the TIRVolcH algorithm, allowing the detection of subtle thermal anomalies throughout the cooling phase, and retrieve lava flow area by fitting theoretical cooling curves to observed VRP_TIR time series. Collating a dataset of 191 mafic eruptions that occurred between 2010 and 2025 at (i) Etna and Stromboli (Italy); (ii) Piton de la Fournaise (France); (iii) Bárðarbunga, Fagradalsfjall, and Sundhnúkagígar (Iceland); (iv) Kīlauea and Mauna Loa (United States); (v) Wolf, Fernandina, and Sierra Negra (Ecuador); (vi) Nyamuragira and Nyiragongo (DRC); (vii) Fogo (Cape Verde); and (viii) La Palma (Spain), we derive a new power-law equation describing mafic lava flow thickening as a function of time across five orders of magnitude (from 0.02 Mm³ to 5.5 km³). Finally, from knowledge of areas and episode durations, we estimate erupted volumes. The method is validated against 68 eruptions with known volumes, yielding high agreement (R² = 0.947; ρ = 0.96; MAPE = 28.60%), negligible bias (MPE = −0.85%), and uncertainties within ±50%. Application to the February–March 2025 Etna eruption further corroborates the robustness of our workflow, from which we estimate a bulk erupted volume of (4.23 ± 2.12) × 10⁶ m³, in close agreement with preliminary estimates from independent data. Beyond volume estimation, we show that VRP_TIR cooling curves follow a consistent decay pattern that aligns with established theoretical thermal models, indicating a stable conductive regime during the cooling stage. This scale-invariant pattern suggests that crustal insulation and heat transfer across a solidifying boundary govern the thermal evolution of cooling basaltic flows. Full article
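A minimal sketch of the curve-fitting step: a power-law decay fitted to a post-eruptive radiative-power time series with SciPy's curve_fit. Both the synthetic series and the functional form are illustrative; the study fits theoretical cooling curves, not this exact expression.

```python
# Power-law fit to a cooling-phase radiative power series.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(10)
t = np.linspace(1, 60, 60)                      # days since eruption end
vrp = 80.0 * t ** -0.8 * (1 + rng.normal(0, 0.05, t.size))  # MW, toy data

def power_law(t, a, k):
    return a * t ** -k

(a, k), cov = curve_fit(power_law, t, vrp, p0=(50.0, 1.0))
perr = np.sqrt(np.diag(cov))                    # 1-sigma parameter errors
print(f"a = {a:.1f} ± {perr[0]:.1f} MW, k = {k:.2f} ± {perr[1]:.2f}")
```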
