Search Results (803)

Search Parameters:
Keywords = event filtering

16 pages, 992 KB  
Article
Operational Speed in Skidding Operations by Cable Skidders and Farm Tractors: Results of a Nationwide Assessment
by Monica Cecilia Zurita Vintimilla and Stelian Alexandru Borz
Appl. Sci. 2025, 15(18), 9921; https://doi.org/10.3390/app15189921 - 10 Sep 2025
Abstract
Accurate estimates of operational speed are crucial for modeling skidding productivity and planning efficient timber extraction. This study provides an event-level characterization of operational speeds in timber skidding operations in Romania, comparing cable skidders and farm tractors. Unlike most previous studies, which are based on limited datasets, this research uses a large, diverse dataset obtained through GNSS tracking over 98 field days at 14 sites, supplemented by synchronized video recordings. A total of 1.74 million seconds of data were collected, with 1.20 million seconds retained for analysis after data quality filtering. Descriptive statistics and Mann–Whitney U tests revealed significant differences in speed. For cable skidders, median speeds ranged from 1.6 km/h during maneuvering at the pre-skidding site to 5.0 km/h during unloaded driving to the pre-skidding site. For farm tractors, median speeds ranged from 2.2 km/h during maneuvering on the forest road to 6.0 km/h when driving unloaded to the pre-skidding site. The highest speeds were observed during unloaded driving, while the lowest occurred during maneuvering. Surprisingly, farm tractors outperformed cable skidders in some operational events due to more favorable terrain. The findings document GNSS-derived speed as a sufficiently reliable proxy for machine performance assessment and provide robust data for predictive modeling, operational planning, and equipment selection in forestry. Full article
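As a rough illustration of the event-level comparison this abstract describes, the sketch below computes median speeds per operational event and applies a Mann–Whitney U test between machine types. The CSV layout and column names (machine, event, speed_kmh) are assumptions for the example, not the authors' actual data format.

```python
import pandas as pd
from scipy.stats import mannwhitneyu

# Assumed event-level table: one row per observation, with columns
# machine ("cable_skidder" / "farm_tractor"), event, and speed_kmh.
df = pd.read_csv("skidding_speeds.csv")

for event, group in df.groupby("event"):
    skidder = group.loc[group["machine"] == "cable_skidder", "speed_kmh"]
    tractor = group.loc[group["machine"] == "farm_tractor", "speed_kmh"]
    u, p = mannwhitneyu(skidder, tractor, alternative="two-sided")
    print(f"{event}: skidder median = {skidder.median():.1f} km/h, "
          f"tractor median = {tractor.median():.1f} km/h, U = {u:.0f}, p = {p:.3g}")
```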
21 pages, 33616 KB  
Article
CycloneWind: A Dynamics-Constrained Deep Learning Model for Tropical Cyclone Wind Field Downscaling Using Satellite Observations
by Yuxiang Hu, Kefeng Deng, Qingguo Su, Di Zhang, Xinjie Shi and Kaijun Ren
Remote Sens. 2025, 17(18), 3134; https://doi.org/10.3390/rs17183134 - 10 Sep 2025
Abstract
Tropical cyclones (TCs) rank among the most destructive natural hazards globally, with core damaging potential originating from regions of intense wind shear and steep wind speed gradients within the eyewall and spiral rainbands. Accurately characterizing these fine-scale structural features is therefore critical for understanding TC intensity evolution, wind hazard distribution, and disaster mitigation. Recently, the deep learning-based downscaling methods have shown significant advantages in efficiently obtaining high-resolution wind field distributions. However, existing methods are mainly used to downscale general wind fields, and research on downscaling extreme wind field events remains limited. There are two main difficulties in downscaling TC wind fields. The first one is that high-quality datasets for TC wind fields are scarce; the other is that general deep learning frameworks lack the ability to capture the dynamic characteristics of TCs. Consequently, this study proposes a novel deep learning framework, CycloneWind, for downscaling TC surface wind fields: (1) a high-quality dataset is constructed by integrating Cyclobs satellite observations with ERA5 reanalysis data, incorporating auxiliary variables like low cloud cover, surface pressure, and top-of-atmosphere incident solar radiation; (2) we propose CycloneWind, a dynamically constrained Transformer-based architecture incorporating three wind field dynamical operators, along with a wind dynamics-constrained loss function formulated to enforce consistency in wind divergence and vorticity; (3) an Adaptive Dynamics-Guided Block (ADGB) is designed to explicitly encode TC rotational dynamics using wind shear detection and wind vortex diffusion operators; (4) Filtering Transformer Layers (FTLs) with high-frequency filtering operators are used for modeling wind field small-scale details. Experimental results demonstrate that CycloneWind successfully achieves an 8-fold spatial resolution reconstruction in TC regions. Compared to the best-performing baseline model, CycloneWind reduces the Root Mean Square Error (RMSE) for the U and V wind components by 9.6% and 4.9%, respectively. More significantly, it achieves substantial improvements of 23.0%, 22.6%, and 20.5% in key dynamical metrics such as divergence difference, vorticity difference, and direction cosine dissimilarity. Full article
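The following sketch shows one plausible form of a wind dynamics-constrained loss of the kind described above, penalising mismatches in finite-difference divergence and vorticity between predicted and reference (u, v) fields. It is not the CycloneWind implementation; the unit grid spacing and the penalty weight are assumptions.

```python
import torch

def div_vort(u, v):
    """Central-difference divergence and vorticity of a 2-D wind field (unit grid spacing assumed)."""
    du_dy, du_dx = torch.gradient(u, dim=(-2, -1))
    dv_dy, dv_dx = torch.gradient(v, dim=(-2, -1))
    return du_dx + dv_dy, dv_dx - du_dy

def dynamics_constrained_loss(u_pred, v_pred, u_ref, v_ref, w_dyn=0.1):
    """Pixel-wise MSE plus a penalty on divergence/vorticity mismatch (weight w_dyn is an assumption)."""
    mse = torch.nn.functional.mse_loss
    div_p, vort_p = div_vort(u_pred, v_pred)
    div_r, vort_r = div_vort(u_ref, v_ref)
    return (mse(u_pred, u_ref) + mse(v_pred, v_ref)
            + w_dyn * (mse(div_p, div_r) + mse(vort_p, vort_r)))
```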

14 pages, 398 KB  
Review
Hemoadsorption in Children with Cytokine Storm Using the Jafron HA330 and HA380 Cartridges
by Kamila Azenova and Vitaliy Sazonov
J. Clin. Med. 2025, 14(18), 6359; https://doi.org/10.3390/jcm14186359 - 9 Sep 2025
Abstract
Background: A cytokine storm can lead to organ dysfunction and death in critically ill children. Extracorporeal hemoperfusion aims to reduce hyperinflammation by filtering out mid-range cytokines (e.g., IL-6), but pediatric data remain limited. Methods: We conducted a narrative review with PRISMA-guided screening of PubMed, Scopus, and Google Scholar for pediatric reports of HA330/HA380 from January 2020 to June 2025. Due to heterogeneity in populations, circuits, and outcome timing, the results were synthesized descriptively. Three studies met the inclusion criteria: a prospective series of 12 patients with septic shock using HA330, a single case of a pediatric heart transplant with HA380 during cardiopulmonary bypass, and a retrospective comparative cohort study of Pediatric Intensive Care Unit (PICU) oncology patients on continuous renal replacement therapy (CRRT) comparing HA330 (n = 11) versus CytoSorb (n = 10). Results: Three studies involving 23 pediatric patients were analyzed. The median age was 8 years, and 56.5% of patients were male. Most patients underwent hemoadsorption with HA330 via continuous renal replacement therapy (CRRT) or continuous venovenous hemodiafiltration (CVVHDF). Post-treatment reductions were noted in interleukin-6 (IL-6) (mean −69.6%), C-reactive protein (CRP) (−59.0%), and procalcitonin (PCT) (−70.4%). Severity scores (Pediatric Logistic Organ Dysfunction-2 (PELOD-2), Pediatric Risk of Mortality-3 (PRISM-3), and Pediatric Sequential Organ Failure Assessment (pSOFA)) improved significantly (p = 0.002). The mean PICU stay was 15.6 days. The survival rate was 87%, and no hemoadsorption-related adverse events were reported. Conclusions: HA330/HA380 hemoadsorption appears to be a safe and potentially effective treatment for pediatric cytokine storms, reducing inflammation and improving clinical status. However, larger, standardized studies are needed to confirm these findings and guide clinical use. Full article
(This article belongs to the Special Issue Clinical Insights into Pediatric Critical Care)

19 pages, 4477 KB  
Article
Non-Contact Heart Rate Variability Monitoring with FMCW Radar via a Novel Signal Processing Algorithm
by Guangyu Cui, Yujie Wang, Xinyi Zhang, Jiale Li, Xinfeng Liu, Bijie Li, Jiayi Wang and Quan Zhang
Sensors 2025, 25(17), 5607; https://doi.org/10.3390/s25175607 - 8 Sep 2025
Abstract
Heart rate variability (HRV), which quantitatively characterizes fluctuations in beat-to-beat intervals, serves as a critical indicator of cardiovascular and autonomic nervous system health. The inherent ability of non-contact methods to eliminate the need for subject contact effectively mitigates user burden and facilitates scalable long-term monitoring, thus attracting considerable research interest in non-contact HRV sensing. In this study, we propose a novel algorithm for HRV extraction utilizing FMCW millimeter-wave radar. First, we developed a calibration-free 3D target positioning module that captures subjects’ micro-motion signals through the integration of digital beamforming, moving target indication filtering, and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) clustering techniques. Second, we established separate phase-based mathematical models for respiratory and cardiac vibrations to enable systematic signal separation. Third, we implemented the Second Order Spectral Sparse Separation Algorithm Using Lagrangian Multipliers, thereby achieving robust heartbeat extraction in the presence of respiratory movements and noise. Heartbeat events are identified via peak detection on the recovered cardiac signal, from which inter-beat intervals and HRV metrics are subsequently derived. Compared to state-of-the-art algorithms and traditional filter bank approaches, the proposed method demonstrated an over 50% reduction in average IBI (Inter-Beat Interval) estimation error, while maintaining consistent accuracy across all test scenarios. However, it should be noted that the method is currently applicable only to scenarios with limited subject movement and has been validated in offline mode, but a discussion addressing these two issues is provided at the end. Full article
(This article belongs to the Section Biomedical Sensors)
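For context only, the sketch below shows the final step the abstract describes, under assumed inputs: peak detection on a recovered cardiac signal, inter-beat intervals, and two standard HRV metrics (SDNN, RMSSD). The sampling rate and heart-rate bounds are placeholders, and the radar signal-separation algorithm itself is not reproduced here.

```python
import numpy as np
from scipy.signal import find_peaks

def hrv_from_cardiac(signal, fs, min_hr_bpm=40, max_hr_bpm=180):
    """Derive IBIs (ms), SDNN and RMSSD from an already-recovered cardiac waveform."""
    min_dist = int(fs * 60.0 / max_hr_bpm)              # minimum samples between beats
    peaks, _ = find_peaks(signal, distance=min_dist)    # heartbeat events
    ibi_ms = np.diff(peaks) / fs * 1000.0               # inter-beat intervals in ms
    ibi_ms = ibi_ms[ibi_ms <= 60000.0 / min_hr_bpm]     # drop physiologically implausible gaps
    sdnn = np.std(ibi_ms, ddof=1)                       # overall IBI variability
    rmssd = np.sqrt(np.mean(np.diff(ibi_ms) ** 2))      # beat-to-beat variability
    return ibi_ms, sdnn, rmssd
```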

42 pages, 1748 KB  
Article
Memory-Augmented Large Language Model for Enhanced Chatbot Services in University Learning Management Systems
by Jaeseung Lee and Jehyeok Rew
Appl. Sci. 2025, 15(17), 9775; https://doi.org/10.3390/app15179775 - 5 Sep 2025
Abstract
A learning management system (LMS) plays a crucial role in supporting students’ educational activities by providing centralized platforms for course delivery, communication, and student support. Recently, many universities have integrated chatbots into their LMS to assist students with various inquiries and tasks. However, existing chatbots often necessitate human intervention to manually respond to complex queries, resulting in limited scalability and efficiency. In this paper, we present a memory-augmented large language model (LLM) framework that enhances the reasoning and contextual continuity of LMS-based chatbots. The proposed framework first embeds user queries and retrieves semantically relevant entries from various LMS resources, including instructional documents and academic frequently asked questions. Retrieved entries are then filtered through a two-stage confidence filtering process that combines similarity thresholds and LLM-based semantic validation. Validated information, along with user queries, is processed by the LLM for response generation. To maintain coherence in multi-turn interactions, the chatbot incorporates short-term, long-term, and temporal event memories, which track conversational flow and personalize responses based on user-specific information, such as recent activity history and individual preferences. To evaluate response quality, we employed a multi-layered evaluation strategy combining BERTScore-based quantitative measurement, an LLM-as-a-Judge approach for automated semantic assessment, and a user study under multi-turn scenarios. The evaluation results consistently confirm that the proposed framework improves the consistency, clarity, and usefulness of the responses. These findings highlight the potential of memory-augmented LLMs for scalable and intelligent learning support within university environments. Full article
(This article belongs to the Special Issue Applications of Digital Technology and AI in Educational Settings)
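A minimal sketch of a two-stage confidence filter in the spirit of the one described above: an embedding-similarity threshold followed by LLM-based semantic validation. The `embed` and `llm_validates` callables and the 0.75 threshold are hypothetical stand-ins, not the paper's components.

```python
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def filter_entries(query_text, query_vec, entries, embed, llm_validates, sim_threshold=0.75):
    """entries: retrieved text snippets; embed() and llm_validates() are hypothetical callables."""
    # Stage 1: similarity threshold over embeddings.
    candidates = [e for e in entries if cosine(query_vec, embed(e)) >= sim_threshold]
    # Stage 2: LLM-based semantic validation of each surviving entry against the query.
    return [e for e in candidates if llm_validates(query_text, e)]
```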

21 pages, 4327 KB  
Article
Event-Triggered Control of Grid-Connected Inverters Based on LPV Model Approach
by Wensheng Luo, Zhiwei Zhang, Zejian Shu, Haibin Li and Jianwen Zhang
Energies 2025, 18(17), 4739; https://doi.org/10.3390/en18174739 - 5 Sep 2025
Abstract
This study aims to develop an event-triggered control strategy of grid-connected inverters, based on the linear parameter-varying (LPV) modeling approach. Regarding the changes in grid voltage, filter capacitance and inductance, and random electromagnetic interference, a stochastic LPV model for three-phase two-level inverters is established. To reduce computation burden, an event trigger with a continuous-time form is adopted to derive the state feedback controller for the LPV plant. Unlike the existing common approach to dealing with event-triggered mechanisms, a predesignated event-triggering threshold is used to determine the triggering instant of the event condition. Using parameter-dependent Lyapunov functions, sufficient conditions reliant on parameters are introduced. Based on the derived conditions, the corresponding event-triggered controllers are engineered to ensure uniform ultimate bounded stability for the resulting event-triggered LPV inverter system subject to exogenous disturbance. The simulation results are presented to confirm the efficacy of the proposed methods. Full article
(This article belongs to the Special Issue Control and Optimization of Power Converters)
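The toy simulation below illustrates the basic event-triggered idea from the abstract: the state-feedback input is only refreshed when the deviation from the last transmitted state exceeds a predesignated threshold. The plant matrices, gain, and threshold are arbitrary assumptions, not the paper's LPV inverter model.

```python
import numpy as np

A = np.array([[0.95, 0.10], [0.00, 0.90]])   # toy discrete-time plant (assumed)
B = np.array([[0.0], [1.0]])
K = np.array([[0.2, 0.5]])                   # assumed stabilising state-feedback gain
delta = 0.05                                 # predesignated event-triggering threshold

x = np.array([[1.0], [-0.5]])
x_last = x.copy()                            # last state transmitted to the controller
events = 0
for k in range(200):
    if np.linalg.norm(x - x_last) >= delta:  # event condition: refresh the control input
        x_last = x.copy()
        events += 1
    u = -K @ x_last                          # control held constant between events
    x = A @ x + B @ u
print(f"{events} control updates in 200 steps, final |x| = {np.linalg.norm(x):.4f}")
```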

22 pages, 3112 KB  
Article
Health Assessment of Zoned Earth Dams by Multi-Epoch In Situ Investigations and Laboratory Tests
by Ernesto Ausilio, Maria Giovanna Durante, Roberto Cairo and Paolo Zimmaro
Geotechnics 2025, 5(3), 60; https://doi.org/10.3390/geotechnics5030060 - 3 Sep 2025
Abstract
The long-term safety and operational reliability of zoned earth dams depend on the structural integrity of their internal components, including core, filters, and shell zones. This is particularly relevant for old dams which have been operational for a long period of time. Such existing infrastructure systems are exposed to various loading types over time, including environmental, seepage-related, extreme event, and climate change effects. As a result, even when they look intact externally, changes might affect their internal structure, composition, and possibly functionality. Thus, it is important to delineate a comprehensive and cost-effective strategy to identify potential issues and derive the health status of existing earth dams. This paper outlines a systematic approach for conducting a comprehensive health check of these structures through the implementation of a multi-epoch geotechnical approach based on a variety of standard measured and monitored quantities. The goal is to compare current properties with baseline data obtained during pre-, during-, and post-construction site investigation and laboratory tests. Guidance is provided on how to judge such multi-epoch comparisons, identifying potential outcomes and scenarios. The proposed approach is tested on a well-documented case study in Southern Italy, an area prone to climate change and subjected to very high seismic hazard. The case study demonstrates how the integration of historical and contemporary geotechnical data allows for the identification of critical zones requiring attention, the validation of numerical models, and the proactive formulation of targeted maintenance and rehabilitation strategies. This comprehensive, multi-epoch-based approach provides a robust and reliable assessment of dams’ health, enabling better-informed decision-making workflows and processes for asset management and risk mitigation strategies. Full article
(This article belongs to the Special Issue Recent Advances in Geotechnical Engineering (3rd Edition))

16 pages, 2074 KB  
Article
Benchmarking Control Strategies for Multi-Component Degradation (MCD) Detection in Digital Twin (DT) Applications
by Atuahene Kwasi Barimah, Akhtar Jahanzeb, Octavian Niculita, Andrew Cowell and Don McGlinchey
Computers 2025, 14(9), 356; https://doi.org/10.3390/computers14090356 - 29 Aug 2025
Abstract
Digital Twins (DTs) have become central to intelligent asset management within Industry 4.0, enabling real-time monitoring, diagnostics, and predictive maintenance. However, implementing Prognostics and Health Management (PHM) strategies within DT frameworks remains a significant challenge, particularly in systems experiencing multi-component degradation (MCD). MCD occurs when several components degrade simultaneously or in interaction, complicating detection and isolation processes. Traditional data-driven fault detection models often require extensive historical degradation data, which is costly, time-consuming, or difficult to obtain in many real-world scenarios. This paper proposes a model-based, control-driven approach to MCD detection, which reduces the need for large training datasets by leveraging reference tracking performance in closed-loop control systems. We benchmark the accuracy of four control strategies—Proportional-Integral (PI), Linear Quadratic Regulator (LQR), Model Predictive Control (MPC), and a hybrid model—within a Digital Twin-enabled hydraulic system testbed comprising multiple components, including pumps, valves, nozzles, and filters. The control strategies are evaluated under various MCD scenarios for their ability to accurately detect and isolate degradation events. Simulation results indicate that the hybrid model consistently outperforms the individual control strategies, achieving an average accuracy of 95.76% under simultaneous pump and nozzle degradation scenarios. The LQR model also demonstrated strong predictive performance, especially in identifying degradation in components such as nozzles and pumps. Also, the sequence and interaction of faults were found to influence detection accuracy, highlighting how the complexities of fault sequences affect the performance of diagnostic strategies. This work contributes to PHM and DT research by introducing a scalable, data-efficient methodology for MCD detection that integrates seamlessly into existing DT architectures using containerized RESTful APIs. By shifting from data-dependent to model-informed diagnostics, the proposed approach enhances early fault detection capabilities and reduces deployment timelines for real-world DT-enabled PHM applications. Full article
(This article belongs to the Section Internet of Things (IoT) and Industrial IoT)
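As a hedged illustration of the control-driven detection idea, the sketch below flags windows whose closed-loop reference-tracking RMSE drifts above a healthy-operation baseline. The window length, factor, and signal names are assumptions, not the benchmarked implementation.

```python
import numpy as np

def detect_degradation(reference, measured, baseline_rmse, window=200, factor=1.5):
    """Return start indices of windows whose tracking RMSE exceeds factor * baseline_rmse."""
    err = np.asarray(reference) - np.asarray(measured)
    flags = []
    for start in range(0, len(err) - window + 1, window):
        rmse = np.sqrt(np.mean(err[start:start + window] ** 2))
        if rmse > factor * baseline_rmse:      # tracking performance has degraded in this window
            flags.append(start)
    return flags
```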

16 pages, 2337 KB  
Article
Lake-Effect Snowfall Climatology over Lake Champlain: A Comparative Analysis of the 2015–2024 and 1997–2006 Periods
by Kazimir D. Nyzio and Ping Liu
Atmosphere 2025, 16(9), 1011; https://doi.org/10.3390/atmos16091011 - 28 Aug 2025
Abstract
This study updates the climatology of lake-effect (LE) snowfall over Lake Champlain by analyzing radar and surface data from nine winter seasons spanning 2015 to 2024. A filtering approach was applied to isolate periods with favorable LE conditions, and events were manually classified using criteria consistent with a previous climatology from 1997 to 2006. A total of 64 LE events were identified and compared across the two periods to evaluate potential changes associated with regional warming. Despite a substantial reduction in lake ice cover during the recent decades, no increase in LE frequency or duration was observed. Instead, warming has shifted the seasonal distribution of events, with fewer early-season cases and more late-season occurrences. LE events also exhibited shorter durations and higher minimum temperatures and dew points. These findings suggest that warming may constrain LE snowfall development over small lakes such as Champlain, in contrast to intensification trends reported for larger lake systems. The analysis also highlights a rarely documented transitional band type that migrated along the lake axis during synoptic shifts. Results underscore the value of observational climatologies for detecting emerging snowfall behaviors in response to climate variability. Full article
(This article belongs to the Section Climatology)

34 pages, 2708 KB  
Article
Integrating Temporal Event Prediction and Large Language Models for Automatic Commentary Generation in Video Games
by Xuanyu Sheng, Aihe Yu, Mingfeng Zhang, Gayoung An, Jisun Park and Kyungeun Cho
Mathematics 2025, 13(17), 2738; https://doi.org/10.3390/math13172738 - 26 Aug 2025
Abstract
Game commentary enhances viewer immersion and understanding, particularly in football video games, where dynamic gameplay offers ideal conditions for automated commentary. The existing methods often rely on predefined templates and game state inputs combined with an LLM, such as GPT-3.5. However, they frequently suffer from repetitive phrasing and delayed responses. Recent studies have attempted to mitigate the response delays by employing traditional machine learning models, such as SVM and ANN, for event prediction. Nonetheless, these models fail to capture the temporal dependencies in gameplay sequences, thereby limiting their predictive performance. To address these limitations, an integrated framework is proposed, combining a lightweight convolutional model with multi-scale temporal filters (OS-CNN) for real-time event prediction and an open-source LLM (LLaMA 3.3) for dynamic commentary generation. Our method incorporates prompt engineering techniques by embedding predicted events into contextualized instruction templates, which enables the LLM to produce fluent and diverse commentary tailored to ongoing gameplay. Evaluated in the Google Research Football environment, the proposed method achieved an F1-score of 0.7470 in the balanced setting, closely matching the best-performing GRU model (0.7547) while outperforming SVM (0.5271) and Transformer (0.7344). In the more realistic Balanced–Imbalanced setting, it attained the highest F1-score of 0.8503, substantially exceeding SVM (0.4708), GRU (0.7376), and Transformer (0.5085). Additionally, it enhances the lexical diversity (Distinct-2: +32.1%) and reduces the phrase repetition by 42.3% (Self-BLEU), compared with template-based generation. These results demonstrate the effectiveness of our approach in generating context-aware, low-latency, and natural commentary suitable for real-time deployment in football video games. Full article
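The snippet below sketches the prompt-engineering step described above, embedding a predicted event into a contextualized instruction template before calling an LLM. The template wording, event fields, and the commented-out `generate` call are placeholders, not the paper's code.

```python
EVENT_TEMPLATES = {
    "goal": "A goal was just scored by {team}. Score is now {score}.",
    "shot": "{team} takes a shot on goal from the {zone}.",
    "pass": "{team} strings passes together in the {zone}.",
}

def build_prompt(event, context):
    """Embed a predicted event into an instruction template for commentary generation."""
    situation = EVENT_TEMPLATES[event["type"]].format(**event)
    return (
        "You are a live football commentator. Keep it to one energetic sentence "
        "and avoid repeating earlier phrasing.\n"
        f"Recent commentary: {context}\n"
        f"Current situation: {situation}\n"
        "Commentary:"
    )

prompt = build_prompt(
    {"type": "shot", "team": "Home", "zone": "penalty box", "score": "1-0"},
    context="The home side has been pressing high for the last two minutes.",
)
# response = generate(prompt)  # hypothetical call to an LLM such as LLaMA 3.3
```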

22 pages, 4564 KB  
Article
Quantification of the Spatial Heterogeneity of PM2.5 to Support the Evaluation of Low-Cost Sensors: A Long-Term Urban Case Study
by Róbert Mészáros, Zoltán Barcza, Bushra Atfeh, Roland Hollós, Erzsébet Kristóf, Ágoston Vilmos Tordai and Veronika Groma
Atmosphere 2025, 16(9), 998; https://doi.org/10.3390/atmos16090998 - 23 Aug 2025
Abstract
During the last decades, the development of novel low-cost sensors (LCSs) commercialized for indoor air quality measurements has gained interest. In this research, three AirVisual Pro air quality monitors were used to monitor PM2.5 and carbon dioxide concentrations, of which two were installed indoors and one outdoors at two residential apartments in Central Europe (Budapest, Hungary). We present a methodology to support the evaluation of indoor sensors by utilizing official outdoor monitoring data, leveraging the fact that indoor spaces are frequently ventilated and thus influenced by outdoor conditions. We compared six-year measurement data (January 2017–December 2022) with outdoor concentrations provided by the Hungarian Air Quality Monitoring Network (HAQM). However, the well-known low spatial representativeness and high spatio-temporal variability of PM2.5 in city environments made this evaluation problematic and needed to be addressed before comparison. Here we quantify the spatial heterogeneity of the HAQM PM2.5 data for a maximum of eight stations. Then, based on the carbon dioxide readings of the AirVisual Pro units, data filtering was performed for the indoor AirVisual 1 and AirVisual 2 sensors to identify ventilated periods (nearly 10,000 ventilated events) for the comparison of indoor and outdoor PM2.5 concentrations. The AirVisual 3 sensor was placed in a garden storage unit, and its measurements were considered outdoor values throughout. Finally, four heterogeneity criteria were set for the HAQM data to filter conditions assumed to be comparable with the indoor sensor data. The results indicate that the spatial heterogeneity was indeed detectable: in approximately 50–60% of the cases, the readings could be considered non-representative for single-location comparison, although the outcome depends on the selected homogeneity criteria. The AirVisual and HAQM comparison indicated relatively low sensitivity to the heterogeneity criteria, which is a promising result that can be exploited. AirVisual sensors generally overestimated PM2.5, but this bias could be corrected with a simple linear adjustment. Slopes varied across sensors (0.83–0.85 for AirVisual 1, 0.48–0.53 for AirVisual 2, and 0.70–0.73 for AirVisual 3), indicating general overestimation, with correlations ranging from moderate to high (R2 = 0.45–0.89) depending on the device. In contrast, when the measurements were compared only with data from the nearest reference station, the match was weaker and the slopes did not agree with those calculated using the homogeneity criteria. This research contributes to the proliferation of citizen science and supports the application of LCSs in indoor conditions. Full article
(This article belongs to the Section Atmospheric Techniques, Instruments, and Modeling)
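A small sketch of the simple linear bias adjustment mentioned in the abstract, under assumed array names: fit reference PM2.5 against the low-cost sensor readings over comparable periods and apply the resulting slope and intercept as a correction.

```python
import numpy as np

def linear_correction(sensor_pm25, reference_pm25):
    """Fit reference ~ sensor by least squares and return corrected readings with fit statistics."""
    sensor = np.asarray(sensor_pm25, dtype=float)
    reference = np.asarray(reference_pm25, dtype=float)
    slope, intercept = np.polyfit(sensor, reference, deg=1)
    corrected = slope * sensor + intercept
    r2 = np.corrcoef(sensor, reference)[0, 1] ** 2
    return corrected, slope, intercept, r2
```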

20 pages, 3583 KB  
Article
Assessment of Radionuclide Contamination in Foreshore Sands of the Baltic Sea near Juodkrante in Lithuania
by Artūras Jukna and Gražina Grigaliūnaitė-Vonsevičienė
Sustainability 2025, 17(16), 7441; https://doi.org/10.3390/su17167441 - 18 Aug 2025
Abstract
This study presents a methodological approach to assess radionuclide contamination in the Baltic Sea near Juodkrante, Lithuania, based on measurements of β- and γ-emissions in seawater, foreshore sand, and dune top sand. Existing assessments often lack sufficient site-specific detail and multicompartment analysis, limiting the understanding of localized contamination and radionuclide behavior in coastal environments. Sampling was carried out between 2019 and 2024 at approximately the same geographical coordinates, along transects orientated normally to the shoreline. Given that the dune top remains unaffected by seawater intrusion, while the foreshore sand is subject to regular inundation, the foreshore environment is considered a natural filter that is capable of accumulating radionuclides from seawater. The proposed methodology supports the hypothesis that radionuclide retention in sandy substrates may persist beyond episodic contamination events in seawater, with retention dynamics influenced by environmental factors such as hydrodynamic conditions and aeolian processes. Simultaneous β- and γ-emission analysis enhances the precision of radionuclide quantification, while comparative evaluation of γ-spectra improves the detection of both natural and anthropogenic radionuclides, providing insight into both contemporary and historical contamination processes. The sustainability of the proposed approach lies in its efficient use of time, resources, and effort to monitor radionuclide contamination. Unlike conventional techniques that require energy-intensive seawater processing, this approach uses foreshore sand, which passively accumulates radionuclides through natural wave-driven deposition. Full article
(This article belongs to the Section Environmental Sustainability and Applications)

21 pages, 10507 KB  
Article
Conditional Random Field Approach Combining FFT Filtering and Co-Kriging for Reliability Assessment of Slopes
by Xin Dong, Tianhong Yang, Yuan Gao, Wenxue Deng, Yang Liu, Peng Niu, Shihui Jiao and Yong Zhao
Appl. Sci. 2025, 15(16), 8858; https://doi.org/10.3390/app15168858 - 11 Aug 2025
Abstract
Conventional unconditional random field (URF) models were shown to neglect in-situ monitoring data and thus misrepresent real slope stability. To address this, a conditional random field (CRF) generator was proposed, in which Fast Fourier Transform (FFT) filtering was coupled with co-Kriging to assimilate site observations. A representative three-bench slope was adopted, and the failure-mode distribution and the statistics of the factor of safety (FoS) produced by the URF, the independent random field (IRF), and the CRF were examined across bedding-dip angles of 15–75° and two cross-correlation states (ρ = −0.2, 0). It was found that eliminating cross-correlation decreased the mean FoS by 0.006, increased its standard deviation by 10.26%, and raised the frequency of low-FoS events from 7.49% to 12.30%. When field constraints were imposed through the CRF, the probability of through-going failure was reduced by 12%, the mean FoS was increased by 0.01, the standard deviation was reduced by 15.38%, and low-FoS events were suppressed to 2.30%. The CRF framework was thus demonstrated to integrate stochastic analysis with field measurements, enabling more realistic reliability assessment and proactive risk management of slopes. Full article
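For illustration only, the sketch below generates an unconditional, spatially correlated Gaussian random field by filtering white noise in the FFT domain with a Gaussian spectral filter. The conditioning step (co-Kriging on site observations) that defines the CRF is not reproduced, and the grid size and correlation length are arbitrary assumptions.

```python
import numpy as np

def fft_filtered_field(nx=128, ny=128, corr_len=10.0, seed=0):
    """Standardised Gaussian random field obtained by spectral (FFT) filtering of white noise."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal((ny, nx))
    kx = np.fft.fftfreq(nx)
    ky = np.fft.fftfreq(ny)
    k2 = kx[None, :] ** 2 + ky[:, None] ** 2
    spectrum = np.exp(-0.5 * k2 * (2.0 * np.pi * corr_len) ** 2)    # Gaussian spectral filter
    field = np.real(np.fft.ifft2(np.fft.fft2(white) * np.sqrt(spectrum)))
    return (field - field.mean()) / field.std()                     # standardise to zero mean, unit variance
```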

24 pages, 27873 KB  
Article
Atmospheric Boundary Layer Height Estimation from Lidar Observations: Assessment and Validation of MIPA Algorithm
by Giuseppe D’Amico, Alberto Arienzo, Gemine Vivone, Aldo Amodeo, Francesco Cardellicchio, Pilar Gumà-Claramunt, Benedetto De Rosa, Paolo Di Girolamo, Ilaria Gandolfi, Aldo Giunta, Teresa Laurita, Fabrizio Marra, Lucia Mona, Michail Mytilinaios, Nikolaos Papagiannopoulos, Marco Rosoldi and Donato Summa
Remote Sens. 2025, 17(16), 2748; https://doi.org/10.3390/rs17162748 - 8 Aug 2025
Abstract
The assessment and optimization of the MIPA (Morphological Image Processing Approach) algorithm for the retrieval of Atmospheric Boundary Layer Height (ABLH) from Aerosol High-power Lidars (AHL) data are presented. MIPA has been developed at CNR-IMAA in the framework of ACTRIS, and it was tested on several lidar datasets, showing, in general, a good agreement with the traditional ABLH retrieval techniques. The main innovative feature of MIPA with respect to other approaches consists in applying optimized morphological filters and object-oriented analysis on lidar timeseries to obtain ABLH estimates. In this study, we carried out a robust MIPA validation effort based on a dedicated measurement campaign organized at CIAO (CNR-IMAA Atmospheric Observatory) in Spring 2024, where several lidar systems were operating continuously along with a quite complete set of other atmospheric sensors and two radiosounding systems. During the campaign, several case studies were considered for MIPA validation, each characterized by an intensive radiosonde schedule to ensure the establishment of a representative ABLH reference dataset. The ABLH retrieved by MIPA was compared against the corresponding ones obtained by radiosonde data. We observed a good overall agreement under different atmospheric conditions, ranging from intense dust events penetrating the ABL to cleaner atmospheric conditions. The best agreement between MIPA and reference dataset is obtained for longer wavelengths (532 nm and 1064 nm) and during daytime conditions. Full article
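As a generic illustration (not the MIPA algorithm), the sketch below applies a morphological opening to a range-time image of range-corrected lidar signal and then takes the strongest negative vertical gradient per profile as a crude layer-height index. The array layout, structuring-element size, and gradient criterion are assumptions for the sketch.

```python
import numpy as np
from scipy import ndimage

def morphological_layer_index(rcs, size=(3, 5)):
    """rcs: 2-D range-corrected signal image (range x time, assumed layout)."""
    opened = ndimage.grey_opening(rcs, size=size)   # suppress thin, bright small-scale artefacts
    grad = np.gradient(opened, axis=0)              # vertical gradient along the range axis
    return opened, np.argmin(grad, axis=0)          # strongest negative gradient per time profile
```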

19 pages, 3549 KB  
Article
Method for Target Detection in a High Noise Environment Through Frequency Analysis Using an Event-Based Vision Sensor
by Will Johnston, Shannon Young, David Howe, Rachel Oliver, Zachry Theis, Brian McReynolds and Michael Dexter
Signals 2025, 6(3), 39; https://doi.org/10.3390/signals6030039 - 5 Aug 2025
Abstract
Event-based vision sensors (EVSs), often referred to as neuromorphic cameras, operate by responding to changes in brightness on a pixel-by-pixel basis. In contrast, traditional framing cameras employ some fixed sampling interval where integrated intensity is read off the entire focal plane at once. Similar to traditional cameras, EVSs can suffer loss of sensitivity through scenes with high intensity and dynamic clutter, reducing the ability to see points of interest through traditional event processing means. This paper describes a method to reduce the negative impacts of these types of EVS clutter and enable more robust target detection through the use of individual pixel frequency analysis, background suppression, and statistical filtering. Additionally, issues found in normal frequency analysis such as phase differences between sources, aliasing, and spectral leakage are less relevant in this method. The statistical filtering simply determines what pixels have significant frequency content after the background suppression instead of focusing on the actual frequencies in the scene. Initial testing on simulated data demonstrates a proof of concept for this method, which reduces artificial scene noise and enables improved target detection. Full article
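A hedged sketch of the per-pixel frequency analysis described above: bin events per pixel into time series, take an FFT per pixel, subtract a scene-wide background spectrum, and keep pixels whose residual spectral power is statistically significant. The bin width, background estimate, and z-score threshold are assumptions, not the paper's parameters.

```python
import numpy as np

def detect_periodic_pixels(event_counts, z_thresh=4.0):
    """event_counts: (H, W, T) array of per-pixel event counts per time bin (assumed layout)."""
    spectra = np.abs(np.fft.rfft(event_counts, axis=-1))[..., 1:]   # per-pixel spectra, DC removed
    background = np.median(spectra, axis=(0, 1))                    # scene-wide background spectrum
    excess = (spectra - background).max(axis=-1)                    # strongest excess power per pixel
    z = (excess - excess.mean()) / (excess.std() + 1e-12)           # simple significance score
    return z > z_thresh                                             # boolean detection mask
```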
