Search Results (5,426)

Search Parameters:
Keywords = robust regression

20 pages, 767 KB  
Article
Real-World Adherence to Asthma and COPD Medications in Belgium: A Nationwide Analysis of Determinants Using Dispensing Data and Mixed-Effects Modeling
by Amélie Rosière, Sebastian Riemann, Olfa Guaddoudi, Stéphanie Pochet, Guy Brusselle and Carine De Vriese
Healthcare 2026, 14(8), 982; https://doi.org/10.3390/healthcare14080982 (registering DOI) - 9 Apr 2026
Abstract
Background/Objectives: Therapeutic adherence to asthma and COPD medications remains worryingly low and varies widely across patient groups, underscoring persistent challenges in chronic respiratory care. The aim of this nationwide study is to quantify real-world adherence and to identify its demographic and clinical determinants using the Belgian health care claims database of the National Institute for Health and Disability Insurance (NIHDI). Methods: Adherence was assessed using the Continuous Multiple Interval Measure of Medication Availability (CMA) among patients treated between 2020 and 2023. Mixed-effects logistic regression was applied to identify determinants of adherence. Results: Only 30.5% of patients achieved good adherence (CMA ≥ 0.8). Adherence varied substantially across pharmacological classes, ranging from 8.1% for inhaled corticosteroids to 66.4% for triple therapy. Age emerged as a major determinant, with adherence increasing steadily across age groups: only 4.0% of children and 15.7% of adolescents reached good adherence, compared with progressively higher rates in adults. Mixed-effects logistic regression confirmed age, sex, and pharmacological class as robust predictors of adherence. Conclusions: These findings highlight the magnitude of the therapeutic adherence gap in chronic respiratory diseases and clearly identify children, adolescents, and ICS or LABA + ICS users as the highest-risk groups. Recognizing these profiles has direct implications for clinical practice, as it provides concrete targets for future patient-centered interventions and guideline-concordant adherence-enhancing strategies. Full article
(This article belongs to the Topic Optimization of Drug Utilization and Medication Adherence)
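The CMA family of adherence measures reduces, in its simplest form, to the fraction of the observation window covered by dispensed supply. A minimal sketch (an illustrative simplification, not the study's implementation; real CMA variants handle overlapping fills and carry-over, and 0.8 is the paper's threshold for good adherence):

```python
from datetime import date

def cma(dispensings, start, end):
    # Simplified CMA: total days of supply dispensed inside the
    # observation window, divided by the window length, capped at 1.
    window = (end - start).days
    supplied = sum(days for d, days in dispensings if start <= d <= end)
    return min(supplied / window, 1.0)

# Three 30-day fills over a 180-day window (hypothetical patient).
records = [(date(2023, 1, 1), 30), (date(2023, 2, 15), 30), (date(2023, 4, 1), 30)]
ratio = cma(records, date(2023, 1, 1), date(2023, 6, 30))  # 90 / 180 = 0.5
adherent = ratio >= 0.8  # the study's cutoff for "good adherence"
```

Here three 30-day fills over six months give CMA = 0.5, well below the 0.8 threshold.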

19 pages, 2474 KB  
Article
Power Laws in Empirical Eigenvalue Spectra
by Benyuan Liu, Yung-Ying Chen, M. Shane Li, Vanessa Thomasin Morgan, Eslam Abdelaleem and Audrey Sederberg
Entropy 2026, 28(4), 418; https://doi.org/10.3390/e28040418 - 9 Apr 2026
Abstract
The critical brain hypothesis proposes that neural systems operate near a phase transition to optimize information processing. A key method for investigating this hypothesis is the phenomenological renormalization group (pRG), which looks for scale-invariant features across levels of coarse-graining. One such feature is the power-law scaling of eigenvalues of covariance matrices of coarse-grained variables. However, the estimation of this scaling exponent, μ, often relies on linear regression over arbitrarily selected ranges of the plot of eigenvalues versus rank. This heuristic “eyeballing” introduces uncontrolled bias and complicates the interpretation of observed scaling relationships. In order to obtain a more robust estimation of μ, we do not fit the standard eigenvalue-vs-rank relationship, but rather the density of eigenvalues, for which standard protocols exist for fitting power laws to empirical data distributions. We demonstrate this approach using a synthetic model that replicates the scaling signatures of neural data while providing control over the system’s exponents as well as neural data obtained from publicly available Neuropixels recordings. We also establish standards for the minimal data required to quantify power-law behavior in a pRG eigenvalue analysis. Our approach contributes a tool for understanding the fundamental limitations imposed by spatial and temporal constraints of experimental datasets, which is required to rigorously assess the neural criticality hypothesis. Full article
(This article belongs to the Special Issue Information-Theoretic Methods in Computational Neuroscience)
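Fitting the eigenvalue density rather than the rank plot lets one use the standard continuous power-law maximum-likelihood estimator. A sketch under Clauset-style assumptions (x_min is fixed in advance here, and the synthetic sample is invented for illustration):

```python
import math
import random

def powerlaw_mle(values, x_min):
    # Continuous power-law MLE for p(x) ~ x^(-alpha), x >= x_min:
    # alpha = 1 + n / sum(ln(x_i / x_min)). This replaces linear
    # regression over hand-picked ranges of the eigenvalue-vs-rank plot.
    tail = [v for v in values if v >= x_min]
    return 1.0 + len(tail) / sum(math.log(v / x_min) for v in tail)

# Synthetic "spectrum" drawn from a power law with alpha = 2 via the
# inverse CDF: x = (1 - u)^(-1/(alpha - 1)) with x_min = 1.
random.seed(0)
sample = [(1.0 - random.random()) ** -1.0 for _ in range(5000)]
alpha_hat = powerlaw_mle(sample, x_min=1.0)  # close to the true alpha = 2
```

With 5000 samples the standard error of the estimator is about (alpha - 1)/sqrt(n) ≈ 0.014, so the recovered exponent lands tightly around 2.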

12 pages, 322 KB  
Article
Disease Severity of Respiratory Syncytial Virus Infection in Hospitalized Children
by Costanza Di Chiara, Vera Rigamonti, Beatrice Rita Campana, Anna Chiara Vittucci, Livia Antilici, Flaminia Ruberti, Hajrie Seferi, Giulia Brigadoi, Daniele Donà, Alberto Villani, Anna Cantarutti and Susanna Esposito
Viruses 2026, 18(4), 451; https://doi.org/10.3390/v18040451 - 9 Apr 2026
Abstract
Background: Respiratory syncytial virus (RSV) is a leading cause of hospitalization for acute respiratory tract infection (ARTI) in young children. Respiratory viral coinfections are frequently identified in RSV-related ARTIs, yet their impact on disease severity remains controversial and may vary according to the co-pathogen involved. In the context of evolving RSV prevention strategies, a clearer understanding of RSV coinfection phenotypes is needed. Methods: We conducted a multicenter retrospective cohort study of children aged ≤ 5 years hospitalized for ARTI at two Italian tertiary-care pediatric hospitals between 1 September 2022 and 30 April 2025. Children with laboratory-confirmed RSV infection detected by multiplex polymerase chain reaction were included. Patients were classified as having RSV monoinfection, RSV–rhinovirus coinfection, or RSV–non-rhinovirus coinfection. Severe disease was defined as a composite outcome including intensive care unit (ICU) admission, need for respiratory or hemodynamic support, or death. The association between infection status and severe disease was evaluated using a Poisson regression model with robust variance, adjusted for age, sex, and comorbidities. Results: Among 231 RSV-related hospitalizations, 118 (51.1%) were classified as RSV monoinfection, 65 (28.1%) as RSV–rhinovirus coinfection, and 48 (20.8%) as RSV–non-rhinovirus coinfection. Children with RSV–rhinovirus coinfection were older and had shorter hospital stays. Severe disease occurred in 80.5% of RSV monoinfections, 70.8% of RSV–rhinovirus coinfections, and 75.0% of RSV–non-rhinovirus coinfections. After adjustment, neither RSV–rhinovirus coinfection (adjusted risk ratio [aRR]: 0.93; 95% confidence interval [95% CI]: 0.80–1.13) nor RSV–non-rhinovirus coinfection (aRR: 0.99; 95% CI: 0.83–1.18) was associated with increased disease severity compared with RSV monoinfection. Conclusions: RSV–rhinovirus and RSV–non-rhinovirus coinfections were not associated with greater disease severity compared with RSV monoinfection in hospitalized children. These findings support pathogen-specific interpretation of multiplex diagnostic results and inform clinical risk stratification in the era of expanding RSV prevention strategies. Full article
(This article belongs to the Section Viral Immunology, Vaccines, and Antivirals)
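The adjusted risk ratios come from a Poisson model with robust variance, but the crude estimand can be reproduced directly from the reported proportions. A sketch with a Wald confidence interval on the log scale (event counts are back-calculated from the abstract's percentages, so treat them as approximate):

```python
import math

def risk_ratio(events_exp, n_exp, events_ref, n_ref):
    # Crude risk ratio with a 95% Wald CI computed on the log scale
    # (Katz method); the study adjusts for age, sex, and comorbidities,
    # which this crude estimate deliberately does not.
    rr = (events_exp / n_exp) / (events_ref / n_ref)
    se = math.sqrt(1 / events_exp - 1 / n_exp + 1 / events_ref - 1 / n_ref)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Severe disease: ~46/65 RSV-rhinovirus coinfections vs ~95/118 monoinfections.
rr, lo, hi = risk_ratio(46, 65, 95, 118)  # crude RR below 1, CI spanning 1
```

The crude RR (~0.88, CI crossing 1) is consistent with the adjusted aRR of 0.93 reported in the abstract: no evidence of greater severity with coinfection.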

22 pages, 4782 KB  
Article
Nondestructive Detection of Eggshell Thickness Using Near-Infrared Spectroscopy Based on GBDT Feature Selection and an Improved CatBoost Algorithm 
by Ziqing Li, Ying Ji, Changheng Zhao, Dehe Wang and Rongyan Zhou
Foods 2026, 15(8), 1286; https://doi.org/10.3390/foods15081286 - 8 Apr 2026
Abstract
Eggshell thickness is a critical indicator for evaluating egg breakage resistance and hatchability, yet traditional measurement methods remain destructive and inefficient. To address this, this study proposes a robust prediction approach by integrating Gradient Boosting Decision Tree (GBDT) feature optimization with an improved CatBoost algorithm. First, a joint strategy of Standard Normal Variate (SNV) and Multiplicative Scatter Correction (MSC) was employed to eliminate spectral scattering noise and enhance organic matrix fingerprint information. Subsequently, GBDT was introduced for nonlinear feature evaluation to adaptively screen the top 50 wavelengths, effectively mitigating the “curse of dimensionality” and multicollinearity in full-spectrum data. A CatBoost regression model was then constructed using an Ordered Boosting mechanism, supported by a dual anti-overfitting strategy that merged 10-fold nested cross-validation with Bootstrap resampling. Experimental results demonstrate that this method significantly outperforms traditional algorithms in both prediction accuracy and generalization. The coefficients of determination (R2) for the calibration and prediction sets reached 0.930 and 0.918, respectively, with a root mean square error of prediction (RMSEP) of 0.008 mm. Residual analysis confirms that prediction errors follow a zero-mean Gaussian distribution, indicating that systematic bias was effectively eliminated. This research provides a reliable theoretical foundation and technical support for the intelligent grading of poultry egg quality. Full article
(This article belongs to the Section Food Analytical Methods)
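The two reported figures of merit, R² and RMSEP, follow their standard definitions. A minimal helper (generic formulas, not code from the paper; the example values are invented eggshell thicknesses in mm):

```python
import math

def r2_rmse(y_true, y_pred):
    # Coefficient of determination and root-mean-square error, the
    # metrics reported for the calibration and prediction sets.
    n = len(y_true)
    mean = sum(y_true) / n
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot, math.sqrt(ss_res / n)

# Hypothetical measured vs predicted eggshell thickness (mm).
r2, rmse = r2_rmse([0.30, 0.35, 0.40], [0.31, 0.34, 0.41])
```

For these toy values R² = 0.94 and RMSE = 0.01 mm, the same order as the paper's reported RMSEP of 0.008 mm.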
18 pages, 1682 KB  
Article
Revolutionizing Pediatric Myopia Care: A Machine Learning Approach for Rapid and Accurate Pre-clinical Screening
by Siqi Zhang and Qi Zhao
J. Clin. Med. 2026, 15(8), 2834; https://doi.org/10.3390/jcm15082834 - 8 Apr 2026
Abstract
Background/Objective: Myopia has become a prominent public health issue in China, significantly impacting the visual health of children and adolescents. The condition is characterized by a high incidence rate, increasing prevalence, and a trend toward earlier onset, highlighting the critical need for early and accurate diagnosis. Current clinical diagnostic methods primarily depend on subjective evaluations by optometrists and the use of isolated parameters, leading to inefficiencies and inconsistent outcomes. Moreover, there remains a lack of diagnostic tools that can effectively integrate multi-parameter analysis while ensuring robust data privacy protection. This study aims to develop an artificial intelligence (AI) diagnostic model that achieves objective, accurate, and safe diagnosis of myopia in children without cycloplegia through multi-parameter fusion and to enable local deployment. The proposed model is intended to be a reliable tool for clinical applications and large-scale screening projects, while ensuring strong protection of patient privacy. Methods: We built a transparent, rule-driven AI framework using clinical guidelines. Key ocular parameters—visual acuity, spherical equivalent, axial length, corneal curvature, and axial ratio—were encoded as logical rules in Python and incorporated via instruction fine-tuning. The model was trained and validated on retrospective clinical data (70% training, 15% validation, 15% test) using five algorithms: gradient boosting, logistic regression, random forest, SVM, and XGBoost. Performance was evaluated using accuracy, precision, recall, F1 score, and mean AUC across classes. Results: The model classifies refractive status into five categories: hyperopia, pre-myopia, mild, moderate, and high myopia. All five algorithms demonstrated excellent diagnostic and classification performance. Gradient boosting achieved the best overall performance, with an accuracy of 98.67%, an F1 score of 98.67%, and a mean AUC of 0.957—outperforming all other models. Conclusions: This study successfully developed an artificial intelligence-based myopia diagnosis system for children under non-dilated pupil conditions. The system is interpretable and privacy-preserving, and has excellent diagnostic and classification performance, making it suitable for clinical decision support and large-scale screening applications. It has great potential to promote the development of early intervention, precision prevention, and control strategies for childhood myopia. Full article
(This article belongs to the Section Ophthalmology)
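The paper encodes guideline thresholds as logical rules in Python. The cutoffs below are common IMI-style spherical-equivalent conventions assumed for illustration, not taken from the article:

```python
def classify_refraction(se):
    # Rule-based five-way classification on spherical equivalent (D).
    # Thresholds are common clinical conventions (assumed, illustrative):
    # hyperopia > +0.75 D; pre-myopia down to -0.50 D; high myopia <= -6.00 D.
    if se > 0.75:
        return "hyperopia"
    if se > -0.50:
        return "pre-myopia"
    if se > -3.00:
        return "mild myopia"
    if se > -6.00:
        return "moderate myopia"
    return "high myopia"
```

A full rule system would combine this with axial length, corneal curvature, and the axial ratio, as the abstract describes.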
16 pages, 411 KB  
Article
Task Assignment for Loitering Munitions Based on Predicted Capturability
by Gyuyeon Choi, Seongwook Heu and Hyeong-Geun Kim
Aerospace 2026, 13(4), 347; https://doi.org/10.3390/aerospace13040347 (registering DOI) - 8 Apr 2026
Abstract
This paper proposes a novel task assignment strategy for multiple fixed-wing loitering munitions, focusing on the kinematic capturability of maneuvering ground targets. Compared to rotary-wing UAVs, fixed-wing munitions are subject to significant turning radius constraints and limited maneuverability. Consequently, conventional assignment metrics based on relative distance or estimated time-to-go are insufficient to guarantee successful interception. To address this, we adopt a data-driven capturability prediction framework based on Gaussian Process Regression (GPR) and propose a novel task assignment strategy that leverages the predicted capture region as a decision-making criterion. Furthermore, a robustness-centric task assignment algorithm is proposed, which prioritizes interceptors based on the radius of the Maximum Inscribed Circle (MIC) within the predicted capture region. This metric quantifies the safety margin against target maneuvers and environmental uncertainties. Numerical simulations demonstrate that the proposed method significantly outperforms conventional distance-based and time-to-go-based approaches, achieving the highest interception success rate across all tested scenarios including maneuvering target conditions. The results validate that incorporating geometric capturability constraints is essential for the efficient operation of fixed-wing loitering munitions. Full article
(This article belongs to the Special Issue Flight Guidance and Control)
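The MIC criterion can be approximated on a rasterised capture region: the clearance of each in-region cell to the nearest out-of-region cell, maximised over cells. A brute-force sketch (the grid and region are invented for illustration; the paper works with GPR-predicted capture regions):

```python
import math

def mic_radius(region, width, height):
    # Maximum Inscribed Circle radius of a rasterised capture region:
    # the max, over in-region cells, of the distance to the nearest
    # outside cell. O(n^2) brute force, fine for small grids.
    outside = [(x, y) for x in range(width) for y in range(height)
               if (x, y) not in region]
    return max(min(math.dist(c, o) for o in outside) for c in region)

# A 5x5 capture region centred in a 7x7 grid; clearance peaks at the centre.
capture = {(x, y) for x in range(1, 6) for y in range(1, 6)}
radius = mic_radius(capture, 7, 7)  # 3.0, attained at cell (3, 3)
```

A larger MIC radius means the assigned interceptor retains more margin against target manoeuvres, which is exactly why the paper uses it to rank candidate assignments.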

28 pages, 3551 KB  
Article
Machine-Learning-Based Parameterisation of Soil Thermal Conductivity for Shallow Geothermal and Ground Heat Exchanger Modelling
by Mateusz Żeruń, Ewa Jagoda and Edyta Majer
Energies 2026, 19(8), 1827; https://doi.org/10.3390/en19081827 (registering DOI) - 8 Apr 2026
Abstract
Thermal conductivity is a key input parameter in geotechnical and shallow geothermal engineering, directly influencing the design, efficiency, and long-term performance of ground heat exchangers, energy piles, and ground-source heat pump systems. Reliable parameterisation of this property in sandy soils remains challenging due to nonlinear interactions between water content, bulk density, and soil structure. This study develops a machine-learning-based workflow for robust parameterisation of thermal conductivity in quartz-rich sands using a large, internally consistent laboratory dataset comprising 1716 samples, including 1455 moist measurements used for modelling, obtained from nationwide site investigations. Air-dry specimens were identified as laboratory-induced drying states and excluded to restrict the analysis to hydro-mechanical conditions representative of typical shallow subsurface environments. Several regression algorithms representing different modelling strategies were evaluated within a unified and reproducible framework and benchmarked against selected classical empirical formulations. Model performance was assessed using standard accuracy metrics together with diagnostics describing the functional stability of predicted thermal-conductivity surfaces. The results reveal a systematic trade-off between predictive accuracy and functional consistency, indicating that models optimised for accuracy may produce functionally unstable and less suitable parameterisations for engineering applications. Accuracy-optimised models frequently produce locally irregular parameter fields, whereas more strongly regularised models yield smoother and physically more coherent response surfaces. The proposed workflow supports reliable thermal-property parameterisation for geotechnical design and shallow geothermal modelling. Full article
(This article belongs to the Special Issue Advances in Thermal Engineering Research and Applied Technologies)

18 pages, 3641 KB  
Article
A Wavelet-Enhanced Detector for Tiny Objects in Remote-Sensing Images
by Weifan Xu and Yong Hu
Remote Sens. 2026, 18(8), 1109; https://doi.org/10.3390/rs18081109 - 8 Apr 2026
Abstract
Accurate and efficient detection is pivotal for tiny objects in remote sensing. However, achieving a favorable accuracy-efficiency trade-off remains challenging due to the few informative pixels of small targets, frequent occlusions, cluttered backgrounds, and detail degradation introduced by downsampling and multi-scale fusion. To address these challenges, we propose WEYOLO, a wavelet-enhanced detector that explicitly models frequency components and adaptively strengthens high-frequency cues to improve tiny-object robustness while maintaining competitive efficiency in inference speed and model size for remote-sensing deployment. To preserve edges and textures when spatial resolution is reduced, we design a Frequency-Aware Lifting Haar (FaLH) backbone that decomposes features into directional sub-bands and retains them during downsampling, preventing the loss of high-frequency information. Next, to address the blurring and detail loss caused by conventional pooling during multi-scale fusion, we introduce a Frequency-Domain Pyramid-Pooling (FDPP) module that performs wavelet-based multi-resolution analysis for frequency-aware feature-pyramid fusion. Additionally, we propose a stable size-aware quality focal regression loss that unifies Focaler-CIoU and size-aware DFL into a single objective, improving robustness and overall accuracy for small objects. Comprehensive experiments show that WEYOLO improves precision and recall over the baseline by 3.2%/4.2% on VisDrone and 2.6%/9.7% on TT100K; on AI-TOD, it achieves 47.5% mAP@0.5 and 21.3% mAP@0.5:0.95. Meanwhile, it reduces the parameter count by 60%, achieving a strong accuracy-efficiency balance for practical aerial sensing deployment. Full article
(This article belongs to the Section AI Remote Sensing)
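Haar lifting on a 2x2 block shows what FaLH-style downsampling retains: the coarse average plus three directional detail sub-bands that ordinary strided pooling discards. A toy one-level sketch (illustrative of the Haar decomposition, not the paper's backbone code):

```python
def haar2x2(a, b, c, d):
    # One Haar step on a 2x2 block [[a, b], [c, d]]: the low-low
    # average plus horizontal, vertical, and diagonal detail sub-bands.
    ll = (a + b + c + d) / 4  # coarse average (what plain pooling keeps)
    lh = (a + b - c - d) / 4  # horizontal edge (top vs bottom rows)
    hl = (a - b + c - d) / 4  # vertical edge (left vs right columns)
    hh = (a - b - c + d) / 4  # diagonal detail
    return ll, lh, hl, hh

# A vertical edge: bright left column, dark right column.
subbands = haar2x2(2.0, 0.0, 2.0, 0.0)  # (1.0, 0.0, 1.0, 0.0)
```

The vertical-edge energy survives in the HL sub-band; a detector that keeps these sub-bands through downsampling preserves exactly the high-frequency cues tiny objects depend on.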

19 pages, 7516 KB  
Article
ForSOC-UA: A Novel Framework for Forest Soil Organic Carbon Estimation and Uncertainty Assessment with Multi-Source Data and Spatial Modeling
by Qingbin Wei, Miao Li, Zhen Zhen, Shuying Zang, Hongwei Ni, Xingfeng Dong and Ye Ma
Remote Sens. 2026, 18(8), 1106; https://doi.org/10.3390/rs18081106 - 8 Apr 2026
Abstract
Accurate estimation of forest soil organic carbon (SOC) is considered critical for understanding terrestrial carbon cycling and supporting climate change mitigation strategies. However, canopy obstruction, the intricate vertical structure of forests, and the constraints of single-source remote sensing data have presented considerable obstacles for estimating forest SOC. This study proposes a forest SOC estimation and uncertainty analysis (ForSOC-UA) framework to enhance forest SOC estimation and quantify its uncertainty in the natural secondary forests of northern China by integrating hyperspectral imagery (ZY-1F), synthetic aperture radar data (Sentinel-1), and environmental covariates (such as topography, vegetation, and soil indices). The performance of traditional machine learning models (RF, SVM, and CNN), geographically weighted regression (GWR), and a geographically weighted random forest (GWRF) model was compared across three different soil depths (0–5 cm, 5–10 cm, and 10–30 cm). The results showed that GWRF consistently outperformed all other models across all soil depth layers, with the highest accuracy achieved using multi-source data (R2 = 0.58, RMSE = 27.49 g/kg, rRMSE = 0.31). Analysis of feature importance revealed that soil moisture, terrain characteristics, and Sentinel-1 polarization attributes were the primary predictors, while spectral derivatives in the red and near-infrared bands from ZY-1F also played a significant role in forest SOC estimation. The uncertainty analysis indicated a forest SOC estimation uncertainty of 37.2 g/kg in the 0–5 cm soil layer, with a decreasing trend as depth increased. This pattern is associated with the vertical spatial distribution of the measured forest SOC. This integrated approach effectively captures spatial heterogeneity and nonlinear relationships between features and forest SOC, while also assessing estimation uncertainty, thus providing a robust methodology for predicting forest SOC. The ForSOC-UA framework addresses the uncertainty quantification of SOC estimation at different vertical depths based on machine learning, providing methodological enhancements for the assessment of large-scale forest SOC and the monitoring of carbon sinks within forest ecosystems. Full article
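Geographically weighted models such as GWR and GWRF differ from their global counterparts only in the distance-decay weighting of training sites around each prediction point. A minimal Gaussian-kernel sketch (bandwidth and coordinates are invented for illustration):

```python
import math

def gwr_weights(point, sites, bandwidth):
    # Gaussian distance-decay weights for fitting a local model at
    # `point`; GWRF applies the same weighting idea when growing a
    # random forest at each location.
    return [math.exp(-math.dist(point, s) ** 2 / (2 * bandwidth ** 2))
            for s in sites]

# Weights for a prediction at the origin against three training sites.
w = gwr_weights((0.0, 0.0), [(0.0, 0.0), (1.0, 0.0), (5.0, 0.0)], bandwidth=2.0)
# nearby sites dominate: w[0] = 1.0 > w[1] > w[2]
```

Because nearby samples dominate each local fit, the model can follow the spatial heterogeneity in SOC that a single global regression would average away.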

28 pages, 1524 KB  
Article
The Impact of Digital–Green Synergy on Firm Innovation Resilience: Evidence from China
by Linzi Zhu and Zaijie Zhang
Sustainability 2026, 18(8), 3661; https://doi.org/10.3390/su18083661 - 8 Apr 2026
Abstract
Innovation is the core driving force behind high-quality development. This study uses a sample of Chinese A-share non-financial listed companies from 2011 to 2024. It empirically examines the impact of digital–green synergy on corporate innovation resilience. We find that digital–green synergy (DG) significantly enhances firm innovation resilience. The baseline regression coefficient is 0.031 (p < 0.01). This conclusion remains robust after addressing endogeneity and conducting various robustness checks. Mechanism tests show that digital–green synergy enhances innovation resilience by improving firms’ absorptive capacity, attracting capital market attention, and cultivating both resource and organizational synergy. Heterogeneity analyses reveal that the impact of this dual transformation depends on firms’ specific characteristics and their internal and external environments. This research provides micro-level evidence on the value-creation mechanisms of dual transformation synergy. The findings offer significant insights for supporting corporate innovation systems in navigating uncertainty and achieving high-quality, sustainable development. Full article

18 pages, 1977 KB  
Article
Boosted Logic-Based Fuzzy Granular Networks
by Keun-Chang Kwak
Electronics 2026, 15(8), 1550; https://doi.org/10.3390/electronics15081550 - 8 Apr 2026
Abstract
Granular modeling has emerged as an interpretable framework for nonlinear system representation by constructing clusters of meaningful data units within the input and output domains. Unlike conventional neuro-fuzzy models that yield crisp outputs, granular models generate fuzzy-set-based outputs, preserving uncertainty information. However, traditional granular architectures rely on linear aggregation mechanisms, limiting their expressive power and structural adaptability. This paper proposes a novel framework termed Logic-Based Fuzzy Granular Networks (LFGNs), in which conventional granular models are enhanced through the incorporation of fuzzy logical neurons implementing AND–OR operations. The proposed logic-based structure enables nonlinear interactions among induced granules while maintaining interpretability. To further improve predictive performance, LFGNs are embedded into a boosting framework, forming a boosted LFGN in which each LFGN acts as a weak learner. Extensive simulation studies on benchmark datasets indicate that the proposed approach outperforms conventional granular models and the existing boosting method in terms of regression accuracy. The integration of logical neurons, boosting, and fuzzy granular models provides a unified and robust granular modeling framework. Full article
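The AND–OR logic neurons follow Pedrycz-style fuzzy neurocomputing: an AND neuron aggregates s-norm-combined input–weight pairs with a t-norm, and an OR neuron is its dual. A min/max realisation (one common choice of t-/s-norms, assumed here; the paper may use others):

```python
def and_neuron(x, w):
    # AND neuron: t-norm (min) over s-norms (max) of inputs and weights.
    # A weight of 0.0 makes the corresponding input decisive.
    return min(max(xi, wi) for xi, wi in zip(x, w))

def or_neuron(x, w):
    # OR neuron: s-norm (max) over t-norms (min) of inputs and weights.
    # A weight of 1.0 lets the corresponding input pass through.
    return max(min(xi, wi) for xi, wi in zip(x, w))

y_or = or_neuron([0.9, 0.2], [1.0, 0.0])   # 0.9: weight 1.0 passes input 1
y_and = and_neuron([0.9, 0.2], [0.0, 1.0])  # 0.9: weight 0.0 makes input 1 decisive
```

Stacking such neurons gives the nonlinear interactions among granules that the abstract contrasts with linear aggregation, while the weights remain readable as fuzzy connectives.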

21 pages, 320 KB  
Article
Xenoepistemics
by Jordi Vallverdú
Philosophies 2026, 11(2), 57; https://doi.org/10.3390/philosophies11020057 - 8 Apr 2026
Abstract
Epistemology remains tacitly anthropocentric: it treats knowledge as something produced and validated through human cognitive capacities such as understanding, intuition, and transparent justification. Yet contemporary science and artificial intelligence increasingly depend on non-human systems that generate mathematically valid results, empirically successful models, and operationally reliable inferences that no human can fully survey or interpret. This article develops xenoepistemics, a structural theory of non-anthropocentric knowledge. The central claim is that epistemic evaluation must be reformulated in terms of system-level properties—reliability, robustness, counterfactual sensitivity, and domain transfer—rather than mentalistic notions such as belief or understanding. I offer (i) a definition of xenoepistemic systems as systems that track structure in a target domain without requiring human-style semantic access; (ii) a minimal account of epistemic agency without minds that avoids trivialization; and (iii) a non-circular trust framework that distinguishes empirical success from epistemic legitimacy using independent validation regimes. This paper addresses a reflexive worry—that a human-authored theory cannot dethrone human epistemology—by separating standpoint from object: xenoepistemics is articulated by humans but is not about human cognition. I discuss the pragmatic value of xenoepistemic knowledge production, the limits of independent verification for opaque systems, domain-relative thresholds for xenoepistemic authority, and the problem of constitutionally human-inaccessible knowledge. Finally, I diagnose and formalize the Marcusian regress paradox: recurrent goalpost-shifting, whereby every machine competence is reclassified as irrelevant once achieved. Xenoepistemics reframes this debate by treating non-human knowledge as a present reality requiring new norms, not as a future curiosity. Full article
(This article belongs to the Special Issue Intelligent Inquiry into Intelligence)

22 pages, 2917 KB  
Article
How the Digital Economy Shapes Green and Low-Carbon Development in the Yangtze River Economic Belt
by Jinjiang Chen, Changqing Guo, Xueyu Bai and Ruizhen Liu
Sustainability 2026, 18(8), 3659; https://doi.org/10.3390/su18083659 - 8 Apr 2026
Abstract
Faced with increasingly severe resource shortages and environmental pressures, exploring the impact of the digital economy on green and low-carbon development and its potential mechanisms is of great significance. Drawing on a comprehensive panel dataset spanning 2014 to 2023, this study covers the 11 provincial-level administrative regions of the Yangtze River Economic Belt in China and systematically examines the effects and underlying pathways of the digital economy on green and low-carbon development. We construct an evaluation index system for the digital economy and green and low-carbon development, and use a two-way fixed effects model, a moderating effect model, and a threshold regression model for empirical analysis. Empirical results show that the digital economy significantly promotes green and low-carbon development, and this conclusion remains robust after a series of robustness tests. Mechanism analysis indicates that green technology innovation plays a significant moderating role, amplifying the environmental benefits of the digital economy; industrial structure upgrading exhibits a double threshold effect, with the promoting effect of the digital economy on green and low-carbon development increasing as each threshold is exceeded. Heterogeneity analysis shows that the ecological effects of the digital economy are significant in the midstream and southwest cluster and in areas with high factor allocation efficiency. We conclude that optimizing the environment for digital economic development, emphasizing innovation in digital green technologies, and implementing differentiated regional and structural policies can achieve a coordinated advancement of digital transformation and green and low-carbon development, providing valuable empirical evidence and policy implications for regional sustainable development.
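The baseline specification described above — a two-way fixed effects model on an 11-region, ten-year panel — can be sketched as a within-transformation estimator. The data, variable names, and coefficient below are simulated placeholders for illustration, not the authors' dataset or code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated balanced panel: 11 regions observed over 10 years,
# mimicking the study's province-by-year structure (illustrative only).
n_units, n_years = 11, 10
digital = rng.normal(size=(n_units, n_years))   # digital-economy index
unit_fe = rng.normal(size=(n_units, 1))         # region fixed effects
year_fe = rng.normal(size=(1, n_years))         # year fixed effects
beta_true = 0.8                                 # assumed true effect
green = beta_true * digital + unit_fe + year_fe \
    + 0.1 * rng.normal(size=(n_units, n_years))

def within_transform(m):
    """Two-way demeaning: subtract unit and year means, add back the grand mean.
    On a balanced panel this removes both additive fixed effects exactly."""
    return m - m.mean(axis=1, keepdims=True) - m.mean(axis=0, keepdims=True) + m.mean()

x = within_transform(digital).ravel()
y = within_transform(green).ravel()
beta_hat = (x @ y) / (x @ x)   # OLS slope on the demeaned data
print(round(float(beta_hat), 2))
```

With both fixed effects swept out by the demeaning, the recovered slope is close to the simulated effect.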

24 pages, 656 KB  
Article
Digital Technology and Energy Efficiency Enhancement: A Theoretical Framework and Empirical Evidence
by Lianghu Wang, Bin Li and Jun Shao
Energies 2026, 19(8), 1819; https://doi.org/10.3390/en19081819 - 8 Apr 2026
Abstract
Improving energy efficiency is critical for tackling environmental issues and achieving sustainable development. Understanding how digital technology affects energy efficiency and its underlying mechanisms can deepen our comprehension of the economic consequences of digital innovation. This study adopts a dictionary-based method to identify digital technology patents from a large-scale patent dataset and employs a comprehensive evaluation approach incorporating both subjective and objective weights to measure digital technology advancement. Building on this framework, the research uses city-level data from China and applies panel data models alongside mediation effect models as core analytical tools to investigate the impact mechanisms and effects of digital technology on energy efficiency. Key findings reveal that digital technology has developed rapidly, exhibiting distinct phase-specific characteristics, especially after 2010, though notable regional disparities remain. Robustness tests confirm that digital technology significantly enhances energy efficiency. Nonlinear regression results indicate that the marginal effect of digital technology changes dynamically across different stages of energy efficiency development. Heterogeneity tests demonstrate that the impact of digital technology on energy efficiency differs substantially across subsamples. Mechanism analysis shows that digital technology enhances energy efficiency primarily through two pathways: green technology innovation and industrial structure upgrading. Further analysis suggests that regional convergence in energy efficiency is objectively present, and digital technology actively accelerates this convergence process. These findings offer practical insights to guide policymakers in designing and implementing digital technology-driven strategies aimed at enhancing energy efficiency.
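The abstract does not name its objective-weighting scheme; a common choice for such composite evaluation indices is the entropy weight method, sketched below. The indicator matrix is hypothetical and the method is an assumption on our part, not the authors' confirmed procedure:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: objective weights for an indicator matrix X
    (rows = cities, columns = indicators; larger values assumed better)."""
    X = np.asarray(X, dtype=float)
    # Min-max normalise each indicator to [0, 1]; small shift avoids log(0).
    Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0)) + 1e-12
    P = Z / Z.sum(axis=0)                                 # city shares per indicator
    e = -(P * np.log(P)).sum(axis=0) / np.log(len(X))     # entropy per indicator
    d = 1 - e                                             # degree of divergence
    return d / d.sum()                                    # normalised weights

# Hypothetical indicator matrix: 5 cities x 3 digital-technology indicators.
X = [[10, 200, 3], [12, 180, 5], [8, 220, 2], [15, 150, 9], [9, 210, 1]]
w = entropy_weights(X)
print(w, w.sum())
```

Indicators whose values vary more across cities carry lower entropy and therefore receive larger weights; the weights sum to one by construction.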

34 pages, 5480 KB  
Article
Metaheuristic Optimization of Treated Sewage Wastewater Quality Parameters with Natural Coagulants
by Joseph K. Bwapwa and Jean G. Mukuna
Water 2026, 18(8), 885; https://doi.org/10.3390/w18080885 - 8 Apr 2026
Abstract
This study presents a comprehensive multi-objective optimization of sewage wastewater treatment using bio-based coagulants, guided by the Grey Wolf Optimizer (GWO) and its multi-objective variant (MOGWO). Experimental coagulation data, employing Citrullus lanatus and Cucumis melo as natural coagulants, were modeled using multivariate regression techniques, yielding high coefficients of determination (R2 > 0.95) across key water quality parameters. The optimization process targeted maximal reductions in turbidity, total suspended solids (TSS), biochemical oxygen demand (BOD), and chemical oxygen demand (COD) through strategic manipulation of pH and coagulant dosage. The single-objective GWO achieved significant outcomes, including a 96.68% turbidity reduction at pH 5 and 50 mg/L dosage. The MOGWO algorithm identified Pareto-optimal solutions, such as a 94.2% turbidity reduction at pH 5 and 72 mg/L dosage, and a balanced BOD reduction of 52.7% at pH 7. The predictive models indicated that optimal treatment conditions could reduce chemical usage by up to 90% compared to conventional coagulants, resulting in potential cost savings of up to 30%. Moreover, the algorithms demonstrated rapid convergence, averaging 200 iterations, highlighting their computational efficiency and robustness. These findings illustrate that integrating bio-based coagulants with advanced optimization techniques can achieve high treatment efficiency while reducing chemical inputs, thus directly supporting environmental sustainability by minimizing sludge and secondary pollution. Under these conditions, wastewater treatment plants can shift toward resource-recovery systems that leave little or no waste at the end of the treatment process. This approach aligns with circular economy principles by promoting eco-friendly, cost-effective wastewater treatment solutions suitable for resource-limited settings. The study offers a forward-looking pathway for environmentally responsible wastewater management practices that significantly reduce chemical dependency and contribute to pollution mitigation efforts.
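For readers unfamiliar with the optimizer, a minimal single-objective GWO can be sketched as below. The sphere objective, bounds, and hyperparameters are placeholders standing in for the study's fitted regression models, not its actual implementation:

```python
import numpy as np

def gwo(f, lb, ub, n_wolves=20, n_iter=200, seed=1):
    """Minimal single-objective Grey Wolf Optimizer (minimisation sketch)."""
    rng = np.random.default_rng(seed)
    dim = len(lb)
    X = rng.uniform(lb, ub, size=(n_wolves, dim))
    for t in range(n_iter):
        fitness = np.apply_along_axis(f, 1, X)
        # Alpha, beta, delta: the three best wolves lead the pack.
        alpha, beta, delta = X[np.argsort(fitness)[:3]]
        a = 2 * (1 - t / n_iter)          # exploration coefficient decays 2 -> 0
        for i, leader in enumerate((alpha, beta, delta)):
            r1 = rng.random((n_wolves, dim))
            r2 = rng.random((n_wolves, dim))
            A, C = 2 * a * r1 - a, 2 * r2
            D = np.abs(C * leader - X)    # encircling distance to the leader
            step = leader - A * D
            X_new = step if i == 0 else X_new + step
        X = np.clip(X_new / 3, lb, ub)    # average of the three pulls, kept in bounds
    fitness = np.apply_along_axis(f, 1, X)
    return X[fitness.argmin()], float(fitness.min())

# Toy objective (sphere function) standing in for the fitted response models.
best_x, best_f = gwo(lambda v: float((v ** 2).sum()),
                     lb=np.array([-5.0, -5.0]), ub=np.array([5.0, 5.0]))
print(best_x, best_f)
```

As the coefficient `a` decays, the pack shifts from exploration to exploitation and collapses onto the best-found region, which is the rapid-convergence behaviour the abstract reports.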
(This article belongs to the Section Wastewater Treatment and Reuse)
