Search Results (188)

Search Parameters:
Keywords = logic mining

22 pages, 2223 KB  
Article
Research on the Human–Machine System Efficiency in Deep Mining Under the Coupling Effect of Multiple Factors
by Duiming Guo, Guoqing Li, Ningting Li, Hongtu Xu and Yunlong Li
Processes 2026, 14(7), 1116; https://doi.org/10.3390/pr14071116 - 30 Mar 2026
Viewed by 278
Abstract
Currently, deep mining has become the development trend of underground mines, and the harsh underground working environment seriously degrades the efficiency of both personnel and equipment. The operational efficiency of the human–machine system composed of personnel and equipment depends not only on the status of the personnel and equipment themselves, but also on the interactions among humans, machines, and the environment. Ensuring the efficient operation of human–machine systems has therefore become key to improving the quality and productivity of mines. To analyze these human–machine–environment interactions during system operation and to explore how human–machine system efficiency varies, this paper constructs a system dynamics model of human–machine system efficiency in deep mining under multi-factor coupling, guided by system dynamics theory, and derives the variation laws of system efficiency under single-factor changes and multi-factor coupling effects. The results address the difficulty of quantitatively describing the logical and quantitative relationships among the elements of human–machine system efficiency, providing new ideas for the study of underground work efficiency. Through mathematical modeling, the temperature threshold for efficient operation of the human–machine system is determined, and the quantitative relationships among temperature, humidity, and wind speed are elaborated, providing a reference for ensuring the efficient operation of human–machine systems in deep mining.
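The system dynamics approach described above can be illustrated with a minimal stock-and-flow sketch: one stock (normalized system efficiency) integrated with Euler steps, and a heat-stress factor that degrades it above an assumed temperature threshold. All names, thresholds, and rate constants here are invented for illustration and are not the paper's calibrated model.

```python
# Minimal system-dynamics sketch (hypothetical parameters, not the paper's).
def simulate(temp_c, steps=100, dt=0.1):
    eff = 1.0  # stock: normalized human-machine system efficiency
    for _ in range(steps):
        heat_stress = max(0.0, (temp_c - 28.0) / 10.0)  # assumed threshold 28 C
        inflow = 0.5 * (1.0 - eff)           # recovery flow toward nominal
        outflow = 0.4 * heat_stress * eff    # degradation flow from heat
        eff += dt * (inflow - outflow)       # Euler integration of the stock
    return eff
```

At 25 C the stress term vanishes and efficiency holds at its nominal level; at 40 C the stock settles at a lower equilibrium, mimicking the kind of threshold behavior the abstract reports.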

21 pages, 8746 KB  
Article
A Hybrid STPA-BN Framework for Quantitative Risk Assessment of Runway Incursions: A Case Study of the Austin–Bergstrom Incident
by Yujiang Feng, Weijun Pan, Rundong Wang, Yanqiang Jiang, Dajiang Song and Xiqiao Dai
Appl. Sci. 2026, 16(6), 2711; https://doi.org/10.3390/app16062711 - 12 Mar 2026
Viewed by 283
Abstract
The escalating complexity of airport surface operations challenges traditional risk quantification methods. Conventional linear models often fail to capture the non-linear interactions within sociotechnical systems. While hybrid System-Theoretic Process Analysis (STPA) and Bayesian Network (BN) models provide an alternative, existing integrations are frequently constrained by ad hoc structural translations and rare-event data sparsity. To address these methodological limitations, this study proposes an enhanced STPA-BN framework. A formalized mapping mechanism (M1–M4) translates qualitative STPA scenarios into a BN topology to quantify non-linear causal dependencies across environmental precursors, operator cognitive states, unsafe control actions, and systemic hazards. Parameterization is achieved via a logic-guided strategy, fusing historical incident data mining with deterministic physical constraints to correct rare-event probabilities. The framework is validated through a reconstruction of the 2023 Austin–Bergstrom runway incursion incident. Results indicate that under low visibility and degraded surveillance, incursion probability escalates to 86%. Sensitivity analysis reveals that while restoring surveillance infrastructure reduces collision risk by ~13%, communication compliance improvements prove insufficient in sensory-deprived environments. These findings quantitatively demonstrate that administrative controls cannot substitute for robust engineering safeguards in complex operations.
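The framework's propagation of probability from environmental precursors to hazards can be sketched as exact Bayesian-network inference by enumeration over a toy three-node chain. The conditional probability values below are hypothetical placeholders, not the paper's calibrated parameters.

```python
from itertools import product

# Toy chain: low visibility -> unsafe control action (UCA) -> hazard.
# All CPT values are illustrative, not from the paper.
P_ENV = 0.1                        # P(low visibility)
P_UCA = {True: 0.7, False: 0.1}    # P(UCA | low visibility = key)
P_HAZ = {True: 0.8, False: 0.05}   # P(hazard | UCA = key)

def p_hazard():
    """Marginal P(hazard) by summing over all parent states."""
    total = 0.0
    for env, uca in product([True, False], repeat=2):
        p_e = P_ENV if env else 1 - P_ENV
        p_u = P_UCA[env] if uca else 1 - P_UCA[env]
        total += p_e * p_u * P_HAZ[uca]
    return total
```

Conditioning on evidence (e.g., fixing env = True to model low visibility) is the same sum restricted to the observed state, which is how scenario probabilities like the reported 86% would be read off a real BN.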

14 pages, 417 KB  
Article
An Architectural Optimization Framework for Scalable Spatial Clustering in High-Redundancy Environments
by Carlos Roberto Valêncio, Wellington Reguera Gouveia, Geraldo Francisco Donegá Zafalon, Angelo Cesar Colombini, Mario Luiz Tronco and Tiago Luís de Andrade
Technologies 2026, 14(3), 171; https://doi.org/10.3390/technologies14030171 - 10 Mar 2026
Viewed by 281
Abstract
Spatial Big Data mining is often hindered by high computational complexity and the intrinsic autocorrelation of georeferenced records. To address these challenges, this study proposes an architectural optimization framework for the CHSMST+ algorithm, designated as CHSMST+MR. Rather than introducing a brand-new clustering paradigm, the framework focuses on a Distributed Spatial Cardinality Reduction (DSCR) layer that aggregates redundant spatial records before the core iterative mining logic begins. By transforming raw records into a weighted key-value representation within the Apache Spark environment, the proposed approach significantly mitigates the shuffling bottleneck common in distributed systems. Experimental validation using high-density biological datasets demonstrates an average execution-time reduction of 51.36%, with performance gains reaching up to 79.96% in specific high-redundancy scenarios. The results, obtained through controlled local emulation, confirm that this architectural optimization provides a scalable, deterministic, and lossless solution for accelerating spatial clustering. This work contributes a methodological path for enhancing the performance of iterative spatial mining algorithms in environments characterized by massive data density and coordinate redundancy.
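The DSCR layer's core move, collapsing duplicate coordinates into weighted records before the iterative mining stage, can be sketched in pure Python; in Spark this would be a `map` to key-value pairs followed by `reduceByKey`. The function name and rounding precision are illustrative assumptions.

```python
from collections import Counter

def reduce_points(points, precision=4):
    """Collapse duplicate (x, y) records into ((x, y), weight) pairs."""
    counts = Counter((round(x, precision), round(y, precision)) for x, y in points)
    return list(counts.items())

# Three raw records become two weighted points; the clustering core then
# iterates over the reduced, lossless representation.
pairs = reduce_points([(1.0, 2.0), (1.0, 2.0), (3.0, 4.0)])
```

Because aggregation happens before any shuffle-heavy iteration, the reduction is where the reported execution-time savings would accrue in a distributed setting.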

17 pages, 1530 KB  
Article
Compatibility for Large-Region Gas Extraction Technology in the Baode Coal Mine
by Xinjiang Luo, Lijun Jiang and Huazhou Huang
Energies 2026, 19(5), 1272; https://doi.org/10.3390/en19051272 - 4 Mar 2026
Viewed by 243
Abstract
To address the challenges of designing geologically compatible, large-scale gas drainage strategies in gassy coal mines, this study introduces an integrated workflow combining detailed gas-geological unit subdivision with the Analytic Hierarchy Process (AHP) for the Baode Coal Mine. This approach aims to transform gas drainage technology selection from empirical judgment to a systematic, quantitative decision-making process, thereby enhancing control precision and mine safety. First, the No. 8 coal seam was refined into ten distinct gas-geological units (II-i to II-x), forming the foundation for a targeted management strategy. For these units, a quantitative evaluation index system was constructed, integrating key factors such as permeability, structural characteristics, and unit area. The AHP was then employed to assess the adaptability of four primary drainage technologies: ULB-uni/bi (underground long borehole unidirectional/bidirectional drainage), UULB (underground ultra-long directional borehole drainage), UDLB-SHF (underground directional long borehole drainage with staged hydraulic fracturing), and FHWS (fractured horizontal wells drilled from the surface). The decision analysis reveals significant regional differentiation in technical suitability. FHWS ranks highest in structurally complex and water-rich zones. UDLB-SHF and UULB serve as viable, cost-effective alternatives to FHWS in various scenarios, with UULB being particularly advantageous for "large-area pre-drainage" in extensive panels with relatively simple geology. ULB-uni/bi is confirmed as the most economical option but is suitable only for minor blocks with simple conditions.
Consequently, the study proposes a hierarchical, zone-specific strategy: prioritizing surface-based FHWS for high-risk zones, employing UDLB-SHF for active permeability enhancement in low-permeability resource-rich areas, utilizing UULB for efficient large-area drainage, and restricting ULB-uni/bi to small, geologically normal blocks. Ultimately, this research establishes a robust technical selection system that integrates fine geological subdivision, AHP-based multi-criteria evaluation, and targeted technology matching. It provides a scientific basis for balancing risk control and cost optimization in gas drainage design for the Baode Coal Mine. In summary, the methodological framework proposed in this study provides a systematic approach for coal mine gas control under complex geological conditions. Its core value lies in achieving the unity of scientificity and practicality in gas control technology decisions through standardized analysis logic and differentiated adaptation mechanisms, thereby providing support for the precise and efficient development of coal mine gas control.
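The AHP step can be sketched with the standard geometric-mean approximation of the principal eigenvector: pairwise comparisons on the usual 1-9 scale are reduced to normalized criterion weights. The 3x3 matrix below is a made-up example, not the paper's evaluation index system.

```python
import math

def ahp_weights(M):
    """Priority weights from a pairwise-comparison matrix (geometric-mean method)."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in M]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]                           # normalize to sum 1

# Hypothetical comparisons among three criteria (e.g., permeability vs
# structural complexity vs unit area); values are illustrative only.
M = [[1,     3,     5],
     [1 / 3, 1,     3],
     [1 / 5, 1 / 3, 1]]
weights = ahp_weights(M)  # sums to 1; the first criterion gets the largest weight
```

In a full AHP workflow one would also compute the consistency ratio of M and, per unit, aggregate criterion scores with these weights to rank the four drainage technologies.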

39 pages, 2472 KB  
Review
Beyond the Comfort Zone: A Review and Gap Analysis of Fuzzing in Smart City IoT Ecosystems
by Qiao Li and Kai Gao
Information 2026, 17(3), 218; https://doi.org/10.3390/info17030218 - 24 Feb 2026
Viewed by 355
Abstract
With the widespread application of Internet of Things (IoT) technology in smart cities, its security issues have become increasingly prominent. Fuzzing, as an efficient automated vulnerability discovery technique, has been widely used in IoT security assessment. However, current research mostly focuses on general IoT environments or specific device types, lacking a systematic analysis of the complex, dynamic, and deeply integrated context of smart cities. This paper presents a review and integration of 42 representative IoT fuzzing studies published between 2021 and 2025, analyzed via an eight-dimensional analytical framework. Set against reports of real-world attacks on IoT systems, it reveals significant gaps between current research and the practical security needs of smart cities across three dimensions: device, protocol, and methodology. On this basis, the paper proposes: (1) an Observability-Complexity Based IoT Device Classification Model, grounded in device observability and business-logic complexity, that provides a navigation chart for migrating testing capabilities across devices; (2) a technology migration framework based on protocol feature matching, facilitating rapid coverage of emerging and vertical protocols; and (3) a methodological evolution path from "vulnerability mining" to "system resilience probing." This research aims to promote the future role of IoT fuzzing in assessing and assuring smart city security resilience by providing structured analytical tools and clear research directions.
(This article belongs to the Special Issue IoT-Based Systems for Resilient Smart Cities)

48 pages, 3619 KB  
Article
Comparative Assessment of the Reliability of Non-Recoverable Subsystems of Mining Electronic Equipment Using Various Computational Methods
by Nikita V. Martyushev, Boris V. Malozyomov, Anton Y. Demin, Alexander V. Pogrebnoy, Georgy E. Kurdyumov, Viktor V. Kondratiev and Antonina I. Karlina
Mathematics 2026, 14(4), 723; https://doi.org/10.3390/math14040723 - 19 Feb 2026
Viewed by 396
Abstract
The assessment of reliability in non-repairable subsystems of mining electronic equipment represents a computationally challenging problem, particularly for complex and highly connected structures. This study presents a systematic comparative analysis of several deterministic approaches for reliability estimation, focusing on their computational efficiency, accuracy, and applicability. The investigated methods include classical boundary techniques (minimal paths and cuts), analytical decomposition based on the Bayes theorem, the logic–probabilistic method (LPM) employing triangle–star transformations, and the algorithmic Structure Convolution Method (SCM), which is based on matrix reduction of the system's connectivity graph. The reliability problem is formally represented using graph theory, where each element is modeled as a binary variable with independent failures, which is a standard and practically justified assumption for power electronic subsystems operating without common-cause coupling. Numerical experiments were carried out on canonical benchmark topologies—bridge, tree, grid, and random connected graphs—representing different levels of structural complexity. The results demonstrate that the SCM achieves exact reliability values with up to six orders of magnitude acceleration compared to the LPM for systems containing more than 20 elements, while maintaining polynomial computational complexity. Qualitatively, the compared approaches differ in the nature of the output and practical applicability: boundary methods provide fast interval estimates suitable for preliminary screening, whereas decomposition may exhibit a systematic bias for highly connected (non-series–parallel) topologies. In contrast, the SCM consistently preserves exactness while remaining computationally tractable for medium and large sparse-to-moderately dense graphs, making it preferable for repeated recalculations in design and optimization workflows.
The methods were implemented in Python 3.7 using NumPy and NetworkX, ensuring transparency and reproducibility. The findings confirm that the SCM is an efficient, scalable, and mathematically rigorous tool for reliability assessment and structural optimization of large-scale non-repairable systems. The presented methodology provides practical guidelines for selecting appropriate reliability evaluation techniques based on system complexity and computational resource constraints.
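As a reference point for the exact methods compared above, the s-t reliability of the classic five-edge bridge network (one of the benchmark topologies named in the abstract) can be checked by brute-force enumeration of all 2^5 element states, with independent element failures. This exponential enumeration is exactly the cost that methods like the SCM avoid; the topology and p = 0.9 are illustrative.

```python
from itertools import product

# Classic bridge: s-a, s-b, the a-b "bridge", a-t, b-t.
EDGES = [("s", "a"), ("s", "b"), ("a", "b"), ("a", "t"), ("b", "t")]

def connected(up):
    """BFS from s over working edges; True if t is reachable."""
    seen, stack = {"s"}, ["s"]
    while stack:
        n = stack.pop()
        for (u, v), ok in zip(EDGES, up):
            if not ok:
                continue
            for x, y in ((u, v), (v, u)):
                if x == n and y not in seen:
                    seen.add(y)
                    stack.append(y)
    return "t" in seen

def reliability(p=0.9):
    """Exact s-t reliability: sum state probabilities over connected states."""
    total = 0.0
    for state in product([True, False], repeat=len(EDGES)):
        prob = 1.0
        for ok in state:
            prob *= p if ok else 1 - p
        if connected(state):
            total += prob
    return total
```

For identical element reliability p = 0.9 this matches the textbook pivotal-decomposition value 0.97848, a handy unit test for any faster method.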

20 pages, 1395 KB  
Article
Frontier Dependence in Brazil’s Commodity Exports: Comparing Brazil’s Legal Amazon Sourcing for the EU and China in Light of the EU–Mercosur Partnership Agreement
by Igor Olech, Katarzyna Kosior and Katarzyna Krupska
Sustainability 2026, 18(4), 2063; https://doi.org/10.3390/su18042063 - 18 Feb 2026
Viewed by 500
Abstract
This study investigates the spatial exposure of Brazil's Legal Amazon (BLA) as the deforestation frontier, operationalized as Brazil's legally defined Amazon Legal administrative region, in Brazil's commodity exports to its two largest partners: the European Union (EU) and China. Focusing on agricultural, forestry and mining commodity groups, a destination-specific Relative Concentration Ratio (RCR) and Compound Annual Growth Rate (CAGR) on physical trade data (2002–2024) were used to examine whether contrasting trade governance logics—the regulatory "Brussels Effect" and the scale-driven "Beijing Effect"—are associated with different sourcing geographies from the BLA frontier. We test three competing expectations: EU spatial avoidance, higher Chinese frontier dependence, and compliance-driven consolidation. The results reveal a counterintuitive paradox: despite stricter sustainability governance, the EU displays persistently higher frontier dependence than China in key commodity groups, with RCR trajectories indicating stabilization rather than spatial avoidance. In contrast, China's frontier dependence declines over time in selected sectors even as import volumes expand substantially, highlighting that changes in frontier exposure cannot be inferred from trade scale alone. CAGR patterns further show strong growth in China-related trade at the national level across commodity groups, alongside sector-specific frontier dynamics within BLA. Overall, the findings provide the strongest support for the consolidation hypothesis: compliance and traceability requirements—public and private—may concentrate EU-linked sourcing among highly auditable, capitalized producers embedded in established frontier zones.
These results imply that without explicit spatial targeting, demand-side regulations such as the EUDR may improve product-level assurances yet fail to induce a geographic shift away from deforestation frontiers, potentially reinforcing trade links with established producers in high-risk regions.
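The two descriptive metrics named in the abstract are straightforward to compute. CAGR is standard; the RCR operationalization below (a destination's frontier share relative to the frontier's share of total exports) is an assumption for illustration, since the paper defines its own ratio.

```python
def cagr(v0, v1, years):
    """Compound annual growth rate over `years` periods."""
    return (v1 / v0) ** (1.0 / years) - 1.0

def rcr(frontier_to_dest, total_to_dest, frontier_total, grand_total):
    """Relative concentration: destination's frontier share vs overall share.

    Values above 1 mean the destination sources disproportionately from the
    frontier region. This formula is an illustrative assumption, not the
    paper's exact definition.
    """
    return (frontier_to_dest / total_to_dest) / (frontier_total / grand_total)
```

For example, if half of a destination's imports come from the frontier while the frontier supplies only a quarter of all exports, the RCR is 2, signaling elevated frontier dependence.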

18 pages, 2817 KB  
Article
Diagnostic Analytics Powered by IoT and Machine Learning for the Fault Evaluation of a Heavy-Industry Gearbox
by Ernesto Primera, Daniel Fernández and Alvaro Rodríguez-Prieto
Machines 2026, 14(2), 187; https://doi.org/10.3390/machines14020187 - 6 Feb 2026
Viewed by 542
Abstract
Predictive maintenance based on vibration monitoring can significantly improve gearbox reliability in heavy-industry environments. Although it is well established in vibration engineering that operating regimes influence vibration levels, the contribution of this work lies in providing an integrated, data-driven diagnostic linkage between continuously acquired IoT vibration indicators and key process/operational variables to identify and quantify the dominant drivers of vibration escalation. This study deployed wireless IoT sensors for continuous acquisition of RMS vibration and lubrication temperature in gearboxes operating in cement and mining plants and applied multivariate machine learning models to detect anomalies and identify root causes. We compared boosted multilayer feedforward neural networks, boosted trees, and k-nearest neighbors using RMS vibration and process variables including mill feed, lubrication pressures, and temperature. The boosted neural network delivered superior validation performance and isolated low or near-zero mill feed during operation as the primary driver of elevated RMS vibration, with lubrication instability acting as a secondary interacting factor. This shifts the diagnosis from a generic "high vibration during transients" statement to actionable operational mitigations—minimum feed set-points, controlled ramping logic, and lubrication pressure governance—supported by multivariate evidence. Our results motivate further validation with k-fold and out-of-time tests.
(This article belongs to the Special Issue Machines and Applications—New Results from a Worldwide Perspective)
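The RMS vibration indicator that the wireless sensors stream is simply the root of the mean squared sample amplitude over a window; a minimal sketch of that feature extraction (window contents are illustrative):

```python
import math

def rms(window):
    """Root-mean-square amplitude of one vibration sample window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

# An alternating signal of constant amplitude has RMS equal to that amplitude.
level = rms([2.0, -2.0, 2.0, -2.0])
```

In a deployment like the one described, each sensor would emit one such RMS value per acquisition window, and those values (with process variables) form the feature vectors fed to the compared models.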

24 pages, 7524 KB  
Article
Bridging the Semantic Gap in BIM Interior Design: A Neuro-Symbolic Framework for Explainable Scene Completion
by Junfu Feng, Ruidan Luo, Xuechao Li, Xiaoping Zhou, Mengmeng Wang, Jiaqi Yin and Hong Yuan
Appl. Sci. 2026, 16(3), 1530; https://doi.org/10.3390/app16031530 - 3 Feb 2026
Viewed by 419
Abstract
Building information modeling (BIM)-based interior design automation remains constrained by a semantic mismatch: engineering constraints are explicit and categorical, whereas aesthetic style is implicit, contextual, and difficult to formalize. As a result, existing systems often overfit local visual similarity or rely on rigid rules, producing recommendations that drift stylistically at the scene level or conflict with professional design logic. This paper proposes KsDesign, a neuro-symbolic framework for interpretable, retrieval-based BIM scene completion that unifies visual style perception with explicit design knowledge. Offline, KsDesign mines category-level co-occurrence and compatibility patterns from curated designer-quality interiors and encodes them as a weighted Furniture-Matching Knowledge Graph (FMKG). Online, it learns style representations exclusively from BIM-derived 2D renderings/projections of 3D family models and BIM scenes, and applies a knowledge-guided attention mechanism to weight contextual furniture cues, synthesizing a global scene-style representation for candidate ranking and retrieval. In a Top-3 (K = 3) evaluation on 10 BIM test scenes with a 20-expert consensus ground truth, KsDesign consistently outperforms single-modal baselines, achieving 86.7% precision in complex scenes and improving average precision by 23.5% (up to 40%), with a 15.5% average recall increase. These results suggest that global semantic constraints can serve as a logical regularizer, mitigating the local biases of purely visual matching and yielding configurations that are both aesthetically coherent and logically valid. We further implement in-authoring explainability within Revit, exposing KG-derived influence weights and evidence paths to support rationale inspection and immediate family insertion.
Finally, the knowledge priors and traceable intermediate representations provide a robust substrate for integration with LLM-driven conversational design agents, enabling constraint-aware, verifiable generation and interactive iteration.

31 pages, 4094 KB  
Article
A Meteorological Data Quality Control Framework for Tea Plantations Using Association Rules Mined from ERA5 Reanalysis Data
by Zhongqiu Zhang, Pingping Li and Jizhang Wang
Agriculture 2026, 16(2), 226; https://doi.org/10.3390/agriculture16020226 - 15 Jan 2026
Cited by 1 | Viewed by 343
Abstract
Meteorological data from automatic weather stations (AWS) in tea plantations are critical for agricultural management but are often compromised by sensor errors and physical implausibilities that traditional quality control (QC) methods fail to detect. This study proposes a novel, meteorologically informed QC framework that mines association rules from long-term ERA5 reanalysis data (2012–2023) using the Apriori algorithm to establish a knowledge base of normal multivariate atmospheric patterns. A comprehensive feature engineering process generated temporal, physical, and statistical features, which were discretized using meteorological thresholds. The mined rules were filtered, prioritized, and integrated with hard physical constraints. The system employs a fuzzy logic mechanism for violation assessment and a weighted anomaly scoring system for classification. When validated on a synthetic dataset with injected anomalies, the method significantly outperformed traditional QC techniques, achieving an F1-score of 0.878 and demonstrating a superior ability to identify complex physical inconsistencies. Applied to an independent historical dataset from a Zhenjiang tea plantation (2008–2016), it flagged 14.6% of records as anomalous, confirming the temporal transferability and robustness of the approach. This framework provides an accurate, interpretable, and scalable solution for enhancing the quality of meteorological data, with direct implications for improving the reliability of frost prediction and pest management in precision agriculture.
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)
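The rule-mining step can be illustrated with a minimal Apriori pass over discretized weather records. Items such as "temp=low" stand in for the paper's threshold-discretized features; the records and support threshold here are invented for illustration.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Frequent itemsets (as frozensets) with support >= min_support."""
    n = len(transactions)
    freq = {}
    size = 1
    candidates = [frozenset([i]) for i in sorted({i for t in transactions for i in t})]
    while candidates:
        counted = {}
        for c in candidates:
            support = sum(1 for t in transactions if c <= t) / n
            if support >= min_support:
                counted[c] = support
        freq.update(counted)
        size += 1
        # Join step: union frequent sets that together form a (size)-itemset.
        keys = list(counted)
        candidates = list({a | b for a, b in combinations(keys, 2) if len(a | b) == size})
    return freq

# Illustrative discretized AWS records (one set of items per timestamp).
records = [{"temp=low", "rh=high"}, {"temp=low", "rh=high"},
           {"temp=low", "rh=low"}, {"temp=high", "rh=high"}]
itemsets = apriori(records, min_support=0.5)
```

Frequent itemsets like {temp=low, rh=high} become the "normal pattern" knowledge base; an incoming record matching no high-support pattern would then accrue anomaly score in a framework of this kind.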

12 pages, 279 KB  
Perspective
Energy Demand, Infrastructure Needs and Environmental Impacts of Cryptocurrency Mining and Artificial Intelligence: A Comparative Perspective
by Marian Cătălin Voica, Mirela Panait and Ștefan Virgil Iacob
Energies 2026, 19(2), 338; https://doi.org/10.3390/en19020338 - 9 Jan 2026
Viewed by 1484
Abstract
This perspective paper aims to set the stage for current developments in energy consumption and environmental impact across two major digital industries: cryptocurrency mining and artificial intelligence (AI). To better understand these developments, the paper uses a comparative analytical framework built on life-cycle assessment principles and high-resolution grid modeling to explore energy impacts from academic and industry data. While both sectors convert energy into digital value, they operate according to very different logics. Cryptocurrency mining relies on specialized hardware (application-specific integrated circuits) and seeks cheap energy; it can function as a "virtual battery" for the grid, shutting down quickly at peak times, and its hardware efficiency keeps improving. AI, by contrast, is a far more rigid emerging energy consumer: it needs high-quality, uninterrupted power and advanced infrastructure for high-performance Graphics Processing Units (GPUs). Its training and inference stages generate massive consumption that is difficult to quantify, and AI data centers put great pressure on the electricity grid. The transition from mining to AI is therefore limited by differences in infrastructure, with access to electrical capacity the only readily reusable asset. Competition between the two industries can fragment the energy grid, as AI tends to monopolize high-quality energy, and how states manage this imbalance will influence the energy and digital security of the next decade.
22 pages, 10194 KB  
Article
MBFI-Net: Multi-Branch Feature Interaction Network for Semantic Change Detection
by Qing Ding, Fengyan Wang, Kaiyuan Sun, Weilong Chen, Mingchang Wang and Gui Cheng
Remote Sens. 2026, 18(1), 179; https://doi.org/10.3390/rs18010179 - 5 Jan 2026
Viewed by 647
Abstract
Semantic change detection (SCD) effectively captures ground object transition information within change regions, delivering more comprehensive and detailed results than binary change detection (BCD) tasks. The existing multi-task SCD models enable parallel processing of segmentation and BCD of bi-temporal remote sensing images, but they still have shortcomings in feature mining, interaction, and cross-task transfer. To address these limitations, a multi-branch feature interaction network (MBFI-Net) is proposed. MBFI-Net designs parallel encoding branches with attention mechanisms that enhance semantic change perception by jointly modeling global contextual patterns and local details. In addition, MBFI-Net introduces bi-temporal feature interaction (BTFI) and cross-task feature transfer (CTFT) modules to improve feature diversity and representativeness, and incorporates prior logical-relationship constraints to improve SCD performance. Comparative and ablation studies on the SECOND and Landsat-SCD datasets highlight the superiority and robustness of MBFI-Net, which achieves SeKs of 0.2117 and 0.5543, respectively. Furthermore, MBFI-Net strikes a balance between SCD results and model complexity and has superior detection performance for semantic change categories with a small proportion.

31 pages, 1865 KB  
Article
Research on the Improvement of Intuitionistic Fuzzy Entropy Measurement Based on TOPSIS Method and Its Application
by Xiao-Guo Chen, Wen-Yue Xiao, Ning Chen, Yu-Ze Zhang and Yue Yang
Mathematics 2026, 14(1), 150; https://doi.org/10.3390/math14010150 - 30 Dec 2025
Viewed by 325
Abstract
Aiming at the problem that existing intuitionistic fuzzy entropy measures fail to fully balance the interaction between intuition (determined by hesitation degree) and fuzziness (characterized by the difference between membership degree and non-membership degree), this paper proposes the concept of isentropic arc, reveals the mutual offset effect of the two in entropy composition, and provides a new theoretical perspective for the planar analysis of entropy measures. Further research finds that there are maximum and minimum entropy points in the intuitionistic fuzzy entropy plane. Based on this, two different types of isentropic arcs can be constructed. Combining this feature with the core logic of approaching the ideal solution, this paper constructs a new intuitionistic fuzzy entropy measure formula based on the TOPSIS method. This formula can characterize the synergistic influence of intuition and fuzziness at the same time, meets all the constraints of the axiomatic definition, and is more suitable for the needs of actual decision-making scenarios. Comparative analysis of numerical examples shows that the proposed new entropy measure has significantly better discrimination than existing methods for six groups of samples with a high hesitation degree and high fuzziness, and the entropy value ranking is consistent with the ranking of the uncertainty information contained in the samples. Finally, the weight decision-making model based on this entropy measure is applied to the evaluation of coal mine emergency rescue capability, verifying its practical value in solving complex uncertainty problems.
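The "approaching the ideal solution" logic that the new entropy measure builds on is the standard TOPSIS closeness coefficient: an alternative's relative distance from an anti-ideal point versus an ideal point. The sketch below shows only that generic coefficient, not the paper's entropy formula; the points are illustrative.

```python
import math

def closeness(x, ideal, anti_ideal):
    """TOPSIS closeness coefficient: 1 at the ideal point, 0 at the anti-ideal."""
    d_pos = math.dist(x, ideal)       # distance to the positive ideal solution
    d_neg = math.dist(x, anti_ideal)  # distance to the negative ideal solution
    return d_neg / (d_pos + d_neg)
```

In the paper's setting, an analogous distance-ratio construction over the intuitionistic fuzzy plane (with its maximum- and minimum-entropy points playing the ideal/anti-ideal roles) yields the proposed entropy measure.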
20 pages, 9502 KB  
Article
Meta-Path-Based Probabilistic Soft Logic for Drug–Target Interaction Predictions
by Shengming Zhang and Yizhou Sun
Mathematics 2025, 13(24), 3958; https://doi.org/10.3390/math13243958 - 12 Dec 2025
Viewed by 517
Abstract
Drug–target interaction (DTI) prediction, which aims to predict whether a drug will bind to a target, has received wide attention recently; the goal is to automate and accelerate the costly process of drug design. Most recently proposed methods use only a single drug–drug similarity and a single target–target similarity for DTI prediction, and thus cannot exploit the abundant information carried by the many other types of similarity between drugs and between targets. Very recently, some methods have been proposed to leverage multi-similarity information; however, they still fail to account for the rich topological information of the knowledge bases in which the drugs and targets reside, and their high computational cost limits their scalability to large networks. To address these challenges, we propose a novel approach named summated meta-path-based probabilistic soft logic (SMPSL). Unlike the original PSL framework, which often overlooks quantitative path frequency, SMPSL explicitly captures crucial meta-path count information. By integrating summated meta-path counts into the PSL framework, our method not only significantly reduces the computational overhead but also effectively models the heterogeneity of the network for robust DTI prediction. We evaluated SMPSL against five strong baselines on three public datasets; the experimental results demonstrate that our approach outperformed all of the baselines in terms of both AUPR and AUC scores. Full article
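The meta-path count idea at the core of SMPSL can be illustrated with a small, self-contained sketch (the edge data, names, and meta-path below are hypothetical toy examples; the actual SMPSL rules and datasets are not reproduced here). For the meta-path drug → target → drug → target, the count between a drug and a target is the number of intermediate (target, drug) pairs connecting them, which is what a product of adjacency matrices would compute:

```python
def metapath_count_DTDT(edges, drug, target):
    """Count instances of the meta-path drug -> target -> drug -> target
    between `drug` and `target`, given drug-target edges as (d, t) pairs.
    Repeated nodes are allowed, matching adjacency-matrix-product counting.
    (Toy illustration; SMPSL feeds such counts into PSL rules.)"""
    edge_set = set(edges)
    drugs = {d for d, _ in edges}
    targets = {t for _, t in edges}
    count = 0
    for t1 in targets:              # first hop: drug -> t1
        if (drug, t1) not in edge_set:
            continue
        for d1 in drugs:            # second hop: t1 -> d1 (shared target)
            if (d1, t1) in edge_set and (d1, target) in edge_set:
                count += 1          # third hop: d1 -> target
    return count

edges = [("d0", "t0"), ("d1", "t0"), ("d1", "t1"), ("d2", "t1")]
print(metapath_count_DTDT(edges, "d0", "t1"))  # 1: d0 -> t0 -> d1 -> t1
```

Summing such counts over a set of meta-paths gives the "summated" features; replacing the nested loops with sparse matrix products is the standard way to make this scale.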
19 pages, 1467 KB  
Article
AI-Driven Process Mining for ESG Risk Assessment in Sustainable Management
by Riccardo Censi, Paola Campana, Francesco Bellini, Fulvio Schettino and Chiara De Pucchio
Buildings 2025, 15(23), 4260; https://doi.org/10.3390/buildings15234260 - 25 Nov 2025
Cited by 2 | Viewed by 1380
Abstract
The construction sector faces growing challenges in integrating sustainability, risk management, and regulatory compliance, in line with initiatives such as the European Green Deal, the Corporate Sustainability Reporting Directive, and international building standards. However, the systematic adoption of ESG metrics in decision-making remains limited due to fragmented data, the lack of predictive tools, and reliance on static reporting. This study proposes and illustrates a digital framework, based on simulated data, that combines Artificial Intelligence, Process Mining, and Robotic Process Automation to enhance ESG risk assessment in sustainable construction management. The model, formalized through Business Process Model and Notation, integrates Machine Learning for risk weighting and classification, and leverages Web Scraping and Business Intelligence for dynamic data acquisition. A simulated case study involving 100 synthetic construction projects is used to demonstrate the internal logic and quantitative feasibility of the framework, showing how automated data integration and predictive modeling can improve the consistency of ESG risk identification and classification. While the results are illustrative rather than empirical, they confirm the analytical coherence and reproducibility of the proposed workflow. From a scientific perspective, the study contributes an integrated methodology that bridges predictive analytics and process management for ESG evaluation. From a practical standpoint, it offers a structured and reproducible workflow to anticipate, classify, and mitigate ESG risks, supporting the construction sector’s transition toward data-driven and sustainability-first management practices. Full article
(This article belongs to the Special Issue Applying Artificial Intelligence in Construction Management)
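The paper's machine-learned risk weighting is not reproduced here, but the basic shape of such a pipeline — aggregating per-pillar risk scores with weights and mapping the result to a discrete risk class — can be sketched as follows (the weights, thresholds, and class labels are hypothetical placeholders; the study derives its weights via Machine Learning rather than fixed constants):

```python
# Hypothetical pillar weights and class thresholds (illustration only).
WEIGHTS = {"environmental": 0.4, "social": 0.3, "governance": 0.3}

def esg_risk_class(scores: dict, weights: dict = WEIGHTS) -> str:
    """Aggregate per-pillar risk scores in [0, 1] into a single weighted
    score, then map it to a discrete risk class."""
    total = sum(weights[k] * scores[k] for k in weights)
    if total >= 0.66:
        return "high"
    if total >= 0.33:
        return "medium"
    return "low"

project = {"environmental": 0.8, "social": 0.5, "governance": 0.4}
print(esg_risk_class(project))  # weighted score 0.59 -> "medium"
```

In the framework described above, the classification step would sit downstream of the automated data acquisition (Web Scraping, Business Intelligence) and upstream of the BPMN-modeled mitigation process.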