Search Results (157)

Search Parameters:
Keywords = plausibility of decisions

12 pages, 351 KB  
Review
Ocular Effects of GLP-1 Receptor Agonists: A Review of Current Evidence and Safety Concerns
by Giuseppe Maria Albanese, Giacomo Visioli, Ludovico Alisi, Francesca Giovannetti, Luca Lucchino, Marta Armentano, Fiammetta Catania, Marco Marenco and Magda Gharbiya
Diabetology 2025, 6(10), 117; https://doi.org/10.3390/diabetology6100117 - 10 Oct 2025
Abstract
Glucagon-like peptide-1 receptor agonists (GLP-1RAs) have emerged as cornerstone therapies for type 2 diabetes mellitus and obesity, offering significant cardiovascular and renal protection. However, recent evidence has sparked interest and concern regarding their potential ocular effects. This review critically synthesizes current data on the impact of GLP-1RAs on diabetic retinopathy (DR), nonarteritic anterior ischemic optic neuropathy (NAION), age-related macular degeneration (AMD), and glaucoma or ocular hypertension. While preclinical studies suggest GLP-1RAs exert anti-inflammatory and neuroprotective effects in retinal tissues, clinical data remain mixed. Several large observational studies suggest a protective role against DR and glaucoma, while others raise safety concerns, particularly regarding semaglutide and NAION. Evidence on AMD is conflicting, with signals of both benefit and risk. We also discuss plausible pathophysiological mechanisms and the relevance of metabolic modulation on retinal perfusion. Overall, while GLP-1RAs hold promise for ocular protection in some contexts, vigilance is warranted, especially in patients with pre-existing eye disease. Further ophthalmology-focused prospective trials are essential to clarify long-term safety and guide clinical decision making. Full article

8 pages, 1017 KB  
Case Report
Isolated Phlegmon of the Round Ligament of the Liver: Clinical Decision-Making in the Context of Lemmel’s Syndrome—A Case Report
by Georgi Popivanov, Marina Konaktchieva, Roberto Cirocchi, Desislava Videva and Ventsislav Mutafchiyski
Reports 2025, 8(4), 192; https://doi.org/10.3390/reports8040192 - 29 Sep 2025
Abstract
Background and Clinical Significance: The pathology of the round ligament (RL) is rare and often remains in the shadow of common surgical emergencies. The preoperative diagnosis is challenging, leaving the surgeon perplexed as to whether and when to operate. The presented case deserves attention due to the difficult decision to operate based solely on the clinical picture, despite negative imaging diagnostic results. Case presentation: A 76-year-old woman was admitted to the Emergency Department with a 6 h history of epigastric pain, nausea, and vomiting. She was afebrile with stable vital signs. The abdomen was slightly tender in the epigastrium, without rebound tenderness or guarding. The following blood variables were beyond the normal range: WBC 13.5 × 10⁹/L; total bilirubin 26 µmol/L; amylase 594 U/L; CRP 11.4 mg/L; ASAT 158 U/L; and ALAT 95 U/L. Ultrasound (US) and multislice computed tomography (MSCT) of the abdomen were normal. A working diagnosis of acute pancreatitis was established, and intravenous infusions were initiated. The next day, the patient became hemodynamically unstable, with blood pressure 80/60 mm Hg, heart rate 130/min, chills and fever of 39.5 °C, and oliguria. There was remarkable guarding and rebound tenderness in the epigastrium. The blood analysis revealed the following: WBC 9.9 × 10⁹/L; total bilirubin 76 µmol/L; direct bilirubin 52 µmol/L; amylase 214 U/L; CRP 245 mg/L; ASAT 161 U/L; ALAT 132 U/L; GGT 272 U/L; urea 15.7 mmol/L; and creatinine 2.77 mg/dL. She was taken to the operating room for exploration, which revealed local peritonitis and phlegmon of the RL. Resection of the RL was performed. The microbiological analysis showed Klebsiella variicola. The patient had an uneventful recovery and was discharged on the 5th postoperative day. In the following months, the patient had several readmissions due to mild cholestasis and pancreatitis. Magnetic resonance imaging demonstrated a duodenal diverticulum adjacent to the papilla, located near the junction of the common bile and pancreatic ducts. This clinical manifestation and the location of the diverticulum were suggestive of Lemmel’s syndrome, but papillary dysfunction attributed to the diverticulum or food stasis cannot be excluded. Conclusion: To our knowledge, we report the first association between RL gangrene and Lemmel’s syndrome. We speculate that duodenal diverticulitis with lymphatic spread of the infection, transient bacteremia in the bile with bacterial translocation due to papillary dysfunction, or cholestasis resulting from the diverticulum could be plausible and previously unreported causes of the RL infection. The preoperative diagnosis of RL gangrene is challenging because it resembles the most common emergency conditions in the upper abdomen. A high index of suspicion should be maintained in a case of unexplained septic shock and epigastric tenderness, even with negative imaging findings. MSCT, however, is a valuable tool to avert unnecessary operations in conditions that must be managed conservatively, such as acute pancreatitis.
(This article belongs to the Section Surgery)

28 pages, 20784 KB  
Article
Systematic Parameter Optimization for LoRA-Based Architectural Massing Generation Using Diffusion Models
by Soon Min Hong and Seungyeon Choo
Buildings 2025, 15(19), 3477; https://doi.org/10.3390/buildings15193477 - 26 Sep 2025
Abstract
This study addresses the systematic optimization of Low-Rank Adaptation (LoRA) parameters for architectural knowledge integration in diffusion models, where existing AI research has provided limited guidance for establishing plausible parameter ranges in architectural massing applications. While diffusion models show increasing utilization in architectural design, general models lack domain-specific architectural knowledge, and previous studies have offered insufficient hyperparameter optimization frameworks for architectural massing studies—fundamental components for expressing architectural knowledge. This research establishes a comprehensive LoRA training framework specifically for architectural mass generation, systematically evaluating caption detail levels, optimizers, learning rates, schedulers, batch sizes, and training steps. Through analysis of 220 architectural mass images representing spatial transformation operations, the study recommends the following parameter settings: detailed captions, Adafactor optimizer, learning rate 0.0003, constant scheduler, and batch size 4, achieving significant improvements in prompt-to-output fidelity compared to baseline approaches. The contribution of this study is not in introducing a new algorithm, but in providing a systematic application of LoRA in the architectural domain, serving as a bridging milestone for both emerging architectural-AI researchers and advanced scholars. The findings provide practical guidelines for integrating AI technologies into architectural design workflows, while demonstrating how systematic parameter optimization can enhance the learning of architectural knowledge and support architects in early-stage massing and design decision-making. Full article
(This article belongs to the Special Issue Artificial Intelligence in Architecture and Interior Design)
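As a rough illustration of the mechanism this entry tunes: the LoRA update adapts a frozen weight matrix W with a low-rank product B @ A scaled by alpha / r. The shapes, rank, and scale below are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r, alpha = 8, 8, 2, 16   # rank r << d; alpha is the LoRA scale
W = rng.normal(size=(d_out, d_in))    # frozen pretrained weight
A = rng.normal(size=(r, d_in))        # trainable down-projection
B = np.zeros((d_out, r))              # trainable up-projection, zero-initialized

def lora_forward(x, W, A, B, alpha, r):
    """Effective weight is W + (alpha / r) * B @ A."""
    return x @ (W + (alpha / r) * B @ A).T

x = rng.normal(size=(1, d_in))
# With B zero-initialized, the adapted model matches the base model exactly.
assert np.allclose(lora_forward(x, W, A, B, alpha, r), x @ W.T)
```

Because B starts at zero, training begins from the unmodified base model; the settings the authors search (optimizer, learning rate, scheduler, batch size, training steps) govern how A and B are then fitted.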

30 pages, 2274 KB  
Article
Biologically Based Intelligent Multi-Objective Optimization for Automatically Deriving Explainable Rule Set for PV Panels Under Antarctic Climate Conditions
by Erhan Arslan, Ebru Akpinar, Mehmet Das, Burcu Özsoy, Gungor Yildirim and Bilal Alatas
Biomimetics 2025, 10(10), 646; https://doi.org/10.3390/biomimetics10100646 - 25 Sep 2025
Abstract
Antarctic research stations require reliable low-carbon power under extreme conditions. This study compiles a synchronized PV-meteorological time-series data set on Horseshoe Island (Antarctica) at 30 s, 1 min, and 5 min resolutions and compares four PV module types (monocrystalline, polycrystalline, flexible mono, and semitransparent) under controlled field operation. Model development adopts an interpretable, multi-objective framework: a modified SPEA-2 searches rule sets on the Pareto front that jointly optimize precision and recall, yielding transparent, physically plausible decision rules for operational use. For context, benchmark machine-learning models (e.g., kNN, SVM) are evaluated on the same splits. Performance is reported with precision, recall, and complementary metrics (F1, balanced accuracy, and MCC), emphasizing class-wise behavior and robustness. Results show that the proposed rule-based approach attains competitive predictive performance while retaining interpretability and stability across panel types and sampling intervals. Contributions are threefold: (i) a high-resolution field data set coupling PV output with solar radiation, temperature, wind, and humidity in polar conditions; (ii) a Pareto-front, explainable rule-extraction methodology tailored to small-power PV; and (iii) a comparative assessment against standard ML baselines using multiple, class-aware metrics. The resulting XAI models achieved 92.3% precision and 89.7% recall. The findings inform the design and operation of PV systems for harsh, high-latitude environments. Full article
(This article belongs to the Section Biological Optimisation and Management)
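The core selection step of Pareto-front rule extraction like this can be sketched as a plain dominance test over (precision, recall) pairs; the scores below are invented, and the paper's modified SPEA-2 additionally handles fitness assignment and archiving on top of this test.

```python
def pareto_front(scores):
    """Return indices of points not dominated in both objectives (maximize)."""
    keep = []
    for i, s in enumerate(scores):
        dominated = any(
            (t[0] >= s[0] and t[1] >= s[1]) and (t[0] > s[0] or t[1] > s[1])
            for j, t in enumerate(scores) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

# Candidate rule sets scored as (precision, recall); made-up values.
rules = [(0.92, 0.60), (0.80, 0.85), (0.70, 0.70), (0.95, 0.40)]
front = pareto_front(rules)   # (0.70, 0.70) is dominated by (0.80, 0.85)
```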

22 pages, 3553 KB  
Article
An Extended Epistemic Framework Beyond Probability for Quantum Information Processing with Applications in Security, Artificial Intelligence, and Financial Computing
by Gerardo Iovane
Entropy 2025, 27(9), 977; https://doi.org/10.3390/e27090977 - 18 Sep 2025
Abstract
In this work, we propose a novel quantum-informed epistemic framework that extends the classical notion of probability by integrating plausibility, credibility, and possibility as distinct yet complementary measures of uncertainty. This enriched quadruple (P, Pl, Cr, Ps) enables a deeper characterization of quantum systems and decision-making processes under partial, noisy, or ambiguous information. Our formalism generalizes the Born rule within a multi-valued logic structure, linking Positive Operator-Valued Measures (POVMs) with data-driven plausibility estimators, agent-based credibility priors, and fuzzy-theoretic possibility functions. We develop a hybrid classical–quantum inference engine that computes a vectorial aggregation of the quadruples, enhancing robustness and semantic expressivity in contexts where classical probability fails to capture non-Kolmogorovian phenomena such as entanglement, contextuality, or decoherence. The approach is validated through three real-world application domains—quantum cybersecurity, quantum AI, and financial computing—where the proposed model outperforms standard probabilistic reasoning in terms of accuracy, resilience to noise, interpretability, and decision stability. Comparative analysis against QBism, Dempster–Shafer, and fuzzy quantum logic further demonstrates the uniqueness of the architecture in both operational semantics and practical outcomes. This contribution lays the groundwork for a new theory of epistemic quantum computing capable of modelling and acting under uncertainty beyond traditional paradigms.
(This article belongs to the Special Issue Probability Theory and Quantum Information)
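One plausible reading of scoring a decision with the (P, Pl, Cr, Ps) quadruple is a weighted combination into a scalar; the convex-combination operator and the weights below are illustrative assumptions, not the paper's actual vectorial aggregation.

```python
def aggregate(p, pl, cr, ps, weights=(0.4, 0.2, 0.2, 0.2)):
    """Convex combination of probability, plausibility, credibility, possibility.

    The weights are hypothetical; the paper keeps the four measures as a
    vector and aggregates them inside a hybrid classical-quantum engine.
    """
    q = (p, pl, cr, ps)
    assert all(0.0 <= v <= 1.0 for v in q), "each measure lives in [0, 1]"
    return sum(w * v for w, v in zip(weights, q))

score = aggregate(p=0.6, pl=0.8, cr=0.7, ps=0.9)  # 0.24 + 0.16 + 0.14 + 0.18
```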

21 pages, 790 KB  
Article
WHO–WHAT–HOW: A Product Operating Model for Agile, Technology-Enabled Digital Transformation
by Raul Ionuț Riti, Claudiu Ioan Abrudan, Laura Bacali and Nicolae Bâlc
Adm. Sci. 2025, 15(9), 368; https://doi.org/10.3390/admsci15090368 - 17 Sep 2025
Abstract
Organizations face rising market volatility, while legacy, plan-driven structures struggle to translate strategy into adaptive execution. Prior studies discuss product-centric operating models, yet typically treat decision rights, product definition, and technology-enabled execution separately. This paper introduces the WHO–WHAT–HOW framework, an authorial synthesis that links decision boundaries (WHO), product scope and value hypotheses (WHAT), and workflow and technology routines (HOW) into a single, operational model. A triangulated design is employed, comprising a systematic document analysis of 62 sources published between 2018 and 2024, illustrative case studies of Amazon and Spotify, and a scenario-based organizational illustration that contrasts a baseline hierarchy with a WHO–WHAT–HOW configuration. Rather than constituting empirical validation, these elements serve as illustrative demonstrations of conceptual plausibility. Indicative composite indices, synthetically constructed from document-coded constructs and simulated rules, suggest improvements in decision speed, cycle time, and coordination; these indices are heuristic and non-inferential. The contribution is threefold: first, a pragmatic five-step implementation roadmap; second, a construct-to-rule mapping and three rule-based vignettes (incident pathway, value-hypothesis experiment, cross-team dependency) showing how WHO–WHAT–HOW compresses decision time, cycle time, and coordination without introducing new measurement programs; and third, the indicative composite indices themselves, which remain heuristic and non-inferential. Limitations include reliance on secondary evidence and a scenario-based, non-empirical illustration; robust validation requires longitudinal, multi-sector primary data and testing in regulated or low-automation settings.

16 pages, 2181 KB  
Article
A Hybrid Deep Learning and PINN Approach for Fault Detection and Classification in HVAC Transmission Systems
by Mohammed Almutairi and Wonsuk Ko
Energies 2025, 18(18), 4796; https://doi.org/10.3390/en18184796 - 9 Sep 2025
Abstract
High-Voltage Alternating Current (HVAC) transmission systems form the backbone of modern power grids, enabling efficient long-distance and high-capacity power delivery. In Saudi Arabia, ongoing initiatives to modernize and strengthen grid infrastructure demand advanced solutions to ensure system reliability, operational stability, and the minimization of economic losses caused by faults. Traditional fault detection and classification methods often depend on the manual interpretation of voltage and current signals, which is both labor-intensive and prone to human error. Although data-driven approaches such as Artificial Neural Networks (ANNs) and Deep Learning have been applied to automate fault analysis, their performance is often constrained by the quality and size of available training datasets, leading to poor generalization and physically inconsistent outcomes. This study proposes a novel hybrid fault detection and classification framework for the 380 kV Marjan–Safaniyah HVAC transmission line by integrating Deep Learning with Physics-Informed Neural Networks (PINNs). The PINN model embeds fundamental electrical laws, such as Kirchhoff’s Current Law (KCL), directly into the learning process, thereby constraining predictions to physically plausible behaviors and enhancing robustness and accuracy. Developed in MATLAB/Simulink using the Deep Learning Toolbox, the proposed framework performs fault detection and fault type classification within a unified architecture. A comparative analysis demonstrates that the hybrid PINN approach significantly outperforms conventional Deep Learning models, particularly by reducing false negatives and improving class discrimination. Furthermore, this study highlights the crucial role of balanced and representative datasets in achieving a reliable performance. Validation through confusion matrices and KCL residual histograms confirms the enhanced physical consistency and predictive reliability of the model. Overall, the proposed framework provides a powerful and scalable solution for real-time monitoring, fault diagnosis, and intelligent decision-making in high-voltage power transmission systems.
(This article belongs to the Special Issue Application of Artificial Intelligence in Electrical Power Systems)
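The physics-informed idea described here, a data-fit term plus a penalty on the Kirchhoff's Current Law residual, can be sketched as follows; the toy arrays and the penalty weight lam are hypothetical, and the actual model embeds this inside a trained network in MATLAB/Simulink.

```python
import numpy as np

def kcl_residual(branch_currents):
    """KCL residual at a node: signed branch currents must sum to zero."""
    return np.sum(branch_currents, axis=-1)

def pinn_loss(pred, target, branch_currents, lam=1.0):
    """Data-fit MSE plus a weighted penalty on the physics residual."""
    data_loss = np.mean((pred - target) ** 2)
    physics_loss = np.mean(kcl_residual(branch_currents) ** 2)
    return data_loss + lam * physics_loss

# A physically consistent sample (currents sum to zero) adds no penalty.
currents = np.array([[1.5, -0.5, -1.0]])
loss = pinn_loss(np.array([0.9]), np.array([1.0]), currents, lam=10.0)
```

Predictions that violate KCL inflate the loss even when they fit the data, which is how the constraint steers training toward physically plausible behavior.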

29 pages, 529 KB  
Article
Fuzzy Multi-Criteria Decision Framework for Asteroid Selection in Boulder Capture Missions
by Nelson Ramírez, Juan Miguel Sánchez-Lozano and Eloy Peña-Asensio
Aerospace 2025, 12(9), 800; https://doi.org/10.3390/aerospace12090800 - 4 Sep 2025
Abstract
A systematic fuzzy multi-criteria decision making (MCDM) framework is proposed to prioritize near-Earth asteroids (NEAs) for a boulder capture mission, addressing the requirement for rigorous prioritization of asteroid candidates under conditions of data uncertainty. Twenty-eight NEA candidates were first selected through filtering based on physical and orbital properties. Then, objective fuzzy weighting MCDM methods (statistical variance, CRITIC, and MEREC) were applied to determine the importance of criteria such as capture cost, synodic period, rotation rate, orbit determination accuracy, and similarity to other candidates. Subsequent fuzzy ranking MCDM techniques (WASPAS, TOPSIS, MARCOS) generated nine prioritization schemes whose coherence was assessed via correlation analysis. An innovative sensitivity analysis employing Dirichlet-distributed random sampling around reference weights quantified ranking robustness. All methodology combinations consistently identified the same top four asteroids, with 2013 NJ ranked first in every scenario, and stability metrics confirmed resilience to plausible weight variations. The proposed modular MCDM methodology provides mission planners with a reliable, adaptable decision support tool for asteroid selection, demonstrably narrowing broad candidate pools to robust targets while accommodating future data updates.
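The Dirichlet-based sensitivity check can be sketched with a plain weighted-sum ranking: draw weight vectors concentrated around the reference weights and count how often the top candidate keeps first place. The scores, weights, and concentration factor below are invented, and the paper ranks with fuzzy WASPAS/TOPSIS/MARCOS rather than a weighted sum.

```python
import numpy as np

rng = np.random.default_rng(42)

scores = np.array([[0.9, 0.7, 0.8],   # candidate 0 (strong overall)
                   [0.6, 0.9, 0.5],
                   [0.5, 0.4, 0.6]])  # rows: candidates, cols: criteria
ref_w = np.array([0.5, 0.3, 0.2])     # reference criterion weights

draws = rng.dirichlet(ref_w * 50, size=1000)   # samples concentrated near ref_w
winners = np.argmax(draws @ scores.T, axis=1)  # weighted-sum winner per draw
stability = np.mean(winners == 0)              # share of draws won by candidate 0
```

A stability close to 1.0 indicates the top ranking is robust to plausible weight perturbations, which is the property the abstract reports for 2013 NJ.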

20 pages, 5097 KB  
Article
A Robust Optimization Framework for Hydraulic Containment System Design Under Uncertain Hydraulic Conductivity Fields
by Wenfeng Gao, Yawei Kou, Hao Dong, Haoran Liu and Simin Jiang
Water 2025, 17(17), 2617; https://doi.org/10.3390/w17172617 - 4 Sep 2025
Abstract
Effective containment of contaminant plumes in heterogeneous aquifers is critically challenged by the inherent uncertainty in hydraulic conductivity (K). Conventional, deterministic optimization approaches for pump-and-treat (P&T) system design often fail when confronted with real-world geological variability. This study proposes a novel robust simulation-optimization framework to design reliable hydraulic containment systems that explicitly account for this subsurface uncertainty. The framework integrates the Karhunen–Loève Expansion (KLE) for efficient stochastic representation of heterogeneous K-fields with a Genetic Algorithm (GA) implemented via the pymoo library, coupled with the MODFLOW groundwater flow model for physics-based performance evaluation. The core innovation lies in a multi-scenario assessment process, where candidate well configurations (locations and pumping rates) are evaluated against an ensemble of K-field realizations generated by KLE. This approach shifts the design objective from optimality under a single scenario to robustness across a spectrum of plausible subsurface conditions. A structured three-step filtering method—based on mean performance, consistency (pass rate), and stability (low variability)—is employed to identify the most reliable solutions. The framework’s effectiveness is demonstrated through a numerical case study. Results confirm that deterministic designs are highly sensitive to the specific K-field realization. In contrast, the robust framework successfully identifies well configurations that maintain a high and stable containment performance across diverse K-field scenarios, effectively mitigating the risk of failure associated with single-scenario designs. Furthermore, the analysis reveals how varying degrees of aquifer heterogeneity influence both the required operational cost and the attainable level of robustness. This systematic approach provides decision-makers with a practical and reliable strategy for designing cost-effective P&T systems that are resilient to geological uncertainty, offering significant advantages over traditional methods for contaminated site remediation.
(This article belongs to the Special Issue Groundwater Quality and Contamination at Regional Scales)
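The three-step filtering described here (mean performance, pass rate, stability) reduces to a few array reductions over an ensemble of realizations; the performance matrix and thresholds below are hypothetical stand-ins for MODFLOW containment results.

```python
import numpy as np

def robust_filter(perf, target=0.9, min_mean=0.9, min_pass=0.9, max_std=0.05):
    """perf: (n_designs, n_realizations) containment performance in [0, 1].

    A design survives only if its ensemble mean, pass rate against the
    containment target, and variability all meet the (assumed) thresholds.
    """
    mean_ok = perf.mean(axis=1) >= min_mean
    pass_ok = (perf >= target).mean(axis=1) >= min_pass
    stable_ok = perf.std(axis=1) <= max_std
    return np.flatnonzero(mean_ok & pass_ok & stable_ok)

perf = np.array([
    [0.95, 0.96, 0.94, 0.95],   # robust: high mean, all pass, low spread
    [0.99, 0.99, 0.60, 0.99],   # fragile: fails on one K-field realization
    [0.85, 0.86, 0.84, 0.85],   # consistently below the containment target
])
selected = robust_filter(perf)   # only design 0 survives all three screens
```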

38 pages, 2474 KB  
Article
Generative and Adaptive AI for Sustainable Supply Chain Design
by Sabina-Cristiana Necula and Emanuel Rieder
J. Theor. Appl. Electron. Commer. Res. 2025, 20(3), 240; https://doi.org/10.3390/jtaer20030240 - 4 Sep 2025
Abstract
This study explores how the integration of generative artificial intelligence, multi-objective evolutionary optimization, and reinforcement learning can enable sustainable and cost-effective decision-making in supply chain strategy. Using real-world retail demand data enriched with synthetic sustainability attributes, we trained a Variational Autoencoder (VAE) to generate plausible future demand scenarios. These were used to seed a Non-Dominated Sorting Genetic Algorithm (NSGA-II) aimed at identifying Pareto-optimal sourcing strategies that balance delivery cost and CO2 emissions. The resulting Pareto frontier revealed favorable trade-offs, enabling up to 50% emission reductions for only a 10–15% cost increase. We further deployed a deep Q-learning (DQN) agent to dynamically manage weekly shipments under a selected balanced strategy. The reinforcement learning policy achieved an additional 10% emission reduction by adaptively switching between green and conventional transport modes in response to demand and carbon pricing. Importantly, the agent also demonstrated resilience during simulated supply disruptions by rerouting decisions in real time. This research contributes a novel AI-based decision architecture that combines generative modeling, evolutionary search, and adaptive control to support sustainability in complex and uncertain supply chains. Full article
(This article belongs to the Special Issue Digitalization and Sustainable Supply Chain)
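Selecting the "balanced strategy" from a cost/CO2 Pareto front can be sketched as picking the normalized point closest to the ideal corner (minimum cost, minimum emissions); the frontier values below are invented, and the paper's front comes from NSGA-II seeded with VAE-generated scenarios.

```python
import numpy as np

# Hypothetical Pareto-optimal sourcing strategies: (delivery cost, CO2 emissions).
front = np.array([[100.0, 50.0],
                  [110.0, 30.0],
                  [115.0, 25.0],
                  [150.0, 20.0]])

# Min-max normalize both objectives, then take the point nearest the ideal (0, 0).
norm = (front - front.min(axis=0)) / (front.max(axis=0) - front.min(axis=0))
balanced = int(np.argmin(np.linalg.norm(norm, axis=1)))  # knee-like compromise
```

The compromise point trades a modest cost increase for a large emission cut, mirroring the 10–15% cost vs. 50% emissions trade-off the abstract reports.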

18 pages, 2567 KB  
Article
Dynamic Vision-Based Non-Contact Rotating Machine Fault Diagnosis with EViT
by Zhenning Jin, Cuiying Sun and Xiang Li
Sensors 2025, 25(17), 5472; https://doi.org/10.3390/s25175472 - 3 Sep 2025
Abstract
Event-based cameras, as a revolutionary class of dynamic vision sensors, offer transformative advantages for capturing transient mechanical phenomena through their asynchronous, per-pixel brightness change detection mechanism. These neuromorphic sensors excel in challenging industrial environments with their microsecond-level temporal resolution, ultra-low power requirements, and exceptional dynamic range that significantly outperform conventional imaging systems. In this way, the event-based camera provides a promising tool for machine vibration sensing and fault diagnosis. However, the dynamic vision data from the event-based cameras have a complex structure, which cannot be directly processed by the mainstream methods. This paper proposes a dynamic vision-based non-contact machine fault diagnosis method. The Eagle Vision Transformer (EViT) architecture is proposed, which incorporates biologically plausible computational mechanisms through its innovative Bi-Fovea Self-Attention and Bi-Fovea Feedforward Network designs. The proposed method introduces an original computational framework that effectively processes asynchronous event streams while preserving their inherent temporal precision and dynamic response characteristics. The proposed methodology demonstrates exceptional fault diagnosis performance across diverse operational scenarios through its unique combination of multi-scale spatiotemporal feature analysis, adaptive learning capabilities, and transparent decision pathways. The effectiveness of the proposed method is extensively validated by the practical condition monitoring data of rotating machines. By successfully bridging cutting-edge bio-inspired vision processing with practical industrial monitoring requirements, this work creates a new paradigm for dynamic vision-based non-contact machinery fault diagnosis that addresses critical limitations of conventional approaches. The proposed method provides new insights for predictive maintenance applications in smart manufacturing environments.

19 pages, 649 KB  
Article
Governing AI Output in Autonomous Driving: Scalable Privacy Infrastructure for Societal Acceptance
by Yusaku Fujii
Future Transp. 2025, 5(3), 116; https://doi.org/10.3390/futuretransp5030116 - 1 Sep 2025
Abstract
As the realization of fully autonomous driving becomes increasingly plausible, its rapid development raises serious privacy concerns. At present, while personal information of passengers and pedestrians is routinely collected, its purpose and usage history are rarely disclosed, and pedestrians in particular are effectively deprived of any meaningful control over their privacy. Furthermore, no institutional framework exists to prevent the misuse or abuse of such data by authorized insiders. This study proposes the application of a novel privacy protection framework—Verifiable Record of AI Output (VRAIO)—to autonomous driving systems. VRAIO encloses the entire AI system behind an output firewall, and an independent entity, referred to as the Recorder, conducts purpose-compliance screening for all outputs. The reasoning behind each decision is recorded in an immutable and publicly auditable format. In addition, institutional deterrence is enhanced through penalties for violations and reward systems for whistleblowers. Focusing exclusively on outputs rather than input anonymization or interpretability of internal AI processes, VRAIO aims to reconcile privacy protection with technical efficiency. This study further introduces two complementary mechanisms to meet the real-time operational demands of autonomous driving: (1) pre-approval for designated outputs and (2) unrestricted approval of internal system communication. This framework presents a new institutional model that may serve as a foundation for ensuring democratic acceptance of fully autonomous driving systems. Full article

15 pages, 718 KB  
Article
Digital Citizenship Practices in Chile: A Measurement Approach for University Students
by Miguel Galván-Cabello, Julio Tereucan-Angulo, Claudio Briceño-Olivera, Scarlet Hauri-Opazo, Isidora Nogués-Solano and Paulo Lugo-Rincón
Digital 2025, 5(3), 38; https://doi.org/10.3390/digital5030038 - 26 Aug 2025
Abstract
This study evaluated the psychometric properties of the Digital Citizenship Scale in Chilean university students; specifically, its factor structure, reliability, construct validity, and factorial invariance by sex were analyzed. The sample consisted of 905 students with an average age of 22 years, of whom 59.7% were women. Exploratory Factor Analysis and Confirmatory Factor Analysis were used. The exploratory analysis suggested retaining the 26 items of the original scale grouped into five factors. The confirmatory analysis corroborated the original structure of the scale and specified a model of five correlated factors. The reliability analysis indicated a total ordinal alpha of 0.87. The measurement invariance analysis showed that the degree of equivalence of the instrument by sex was plausible at a strict level. The scale provides guidance for institutional decision-making regarding initiatives focused on digital inclusion and participation. It was concluded that the Digital Citizenship Scale presents adequate psychometric properties for use with Chilean university students. Full article

19 pages, 1347 KB  
Article
Enhancing MUSIC’s Capability for Performance Evaluation and Optimization of Established Urban Constructed Wetlands
by Fujia Yang, Shirley Gato-Trinidad and Iqbal Hossain
Hydrology 2025, 12(8), 219; https://doi.org/10.3390/hydrology12080219 - 18 Aug 2025
Abstract
The Model for Urban Stormwater Improvement Conceptualization (MUSIC) serves as a key hydrological tool for simulating urban stormwater runoff pollution and evaluating the treatment performance in Water-Sensitive Urban Designs like constructed wetlands (CWs). However, a significant limitation exists in MUSIC’s current inability to model heavy metal contaminants, even though they are commonly found in urban stormwater and pose significant environmental risks. This eventually affects the model’s utility during critical planning phases for urban developments. Thus, there is a need to address this limitation. Field investigations were conducted across established CWs in residential and industrial catchments throughout Greater Melbourne, Australia. Through systematic monitoring and calibration, an approach was developed to extend MUSIC’s predictive capabilities to include several prevalent heavy metals. The results indicate that the enhanced model can generate plausible estimates for targeted metals while differentiating catchment-specific pollutant generation and treatment patterns. This advancement enhances MUSIC’s functionality as a planning support tool, enabling the preliminary assessment of heavy metal dynamics alongside conventional pollutants during both design and operational stages. The findings underscore the value of incorporating metal-specific parameters into stormwater models, offering improved support for urban water management decisions and long-term water quality protection. Full article
(This article belongs to the Special Issue Advances in Urban Hydrology and Stormwater Management)

20 pages, 8763 KB  
Article
An Integrated Approach to Real-Time 3D Sensor Data Visualization for Digital Twin Applications
by Hyungki Kim and Hyowon Suh
Electronics 2025, 14(15), 2938; https://doi.org/10.3390/electronics14152938 - 23 Jul 2025
Abstract
Digital twin technology is emerging as a core technology that models physical objects or systems in a digital space and links real-time data to accurately reflect the state and behavior of the real world. For the effective operation of such digital twins, high-performance visualization methods that support an intuitive understanding of the vast amounts of data collected from sensors and enable rapid decision-making are essential. The proposed system is designed as a balanced 3D monitoring solution that prioritizes intuitive, real-time state observation. Conventional 3D-simulation-based systems, while offering high physical fidelity, are often unsuitable for real-time monitoring due to their significant computational cost. Conversely, 2D-based systems are useful for detailed analysis but struggle to provide an intuitive, holistic understanding of multiple assets within a spatial context. This study introduces a visualization approach that bridges this gap. By leveraging sensor data, our method generates a physically plausible representation on 3D CAD models, enabling at-a-glance comprehension in a visual format reminiscent of simulation analysis, without claiming equivalent physical accuracy. The proposed method includes GPU-accelerated interpolation, the user-selectable application of geodesic and Euclidean distance calculations, the automatic resolution of CAD model connectivity issues, the integration of Physically Based Rendering (PBR), and enhanced data interpretability through ramp shading. The proposed system was implemented in the Unity3D environment. Through various experiments, it was confirmed that the system maintained high real-time performance, achieving tens to hundreds of Frames Per Second (FPS), even with complex 3D models and numerous sensor readings. Moreover, the application of geodesic distance yielded a more intuitive representation of surface-based phenomena, while PBR integration significantly enhanced visual realism, thereby enabling the more effective analysis and utilization of sensor data in digital twin environments. Full article
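The distance-based interpolation of sparse sensor readings onto model surfaces described in this abstract can be illustrated with a minimal inverse-distance-weighting sketch. This is not the paper's GPU-accelerated implementation, and it uses only the Euclidean (not geodesic) distance option; the function name `idw_interpolate` and the `power` parameter are illustrative assumptions.

```python
import math

def idw_interpolate(vertices, sensors, power=2.0, eps=1e-9):
    """Estimate a scalar value at each mesh vertex from sparse sensor
    readings using inverse-distance weighting over Euclidean distance.

    vertices: list of (x, y, z) mesh vertex positions
    sensors:  list of ((x, y, z), reading) pairs
    """
    values = []
    for v in vertices:
        num = 0.0  # weighted sum of readings
        den = 0.0  # sum of weights
        for pos, reading in sensors:
            d = math.dist(v, pos)
            if d < eps:  # vertex coincides with a sensor: use it directly
                num, den = reading, 1.0
                break
            w = 1.0 / d ** power
            num += w * reading
            den += w
        values.append(num / den)
    return values
```

A vertex midway between two sensors receives the average of their readings; replacing `math.dist` with a geodesic (surface-path) distance, as the paper offers, changes how values spread across curved or disconnected CAD surfaces.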
(This article belongs to the Section Computer Science & Engineering)
